ABC-GS: Alignment-Based Controllable Style Transfer for 3D Gaussian Splatting
- URL: http://arxiv.org/abs/2503.22218v1
- Date: Fri, 28 Mar 2025 08:07:57 GMT
- Title: ABC-GS: Alignment-Based Controllable Style Transfer for 3D Gaussian Splatting
- Authors: Wenjie Liu, Zhongliang Liu, Xiaoyan Yang, Man Sha, Yang Li
- Abstract summary: We introduce ABC-GS, a novel framework based on 3D Gaussian Splatting to achieve high-quality 3D style transfer. A controllable matching stage is designed to achieve precise alignment between scene content and style features. A style transfer loss function based on feature alignment is proposed to ensure that the outcomes of style transfer accurately reflect the global style of the reference image.
- Score: 4.735758456213964
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 3D scene stylization approaches based on Neural Radiance Fields (NeRF) achieve promising results by optimizing with a Nearest Neighbor Feature Matching (NNFM) loss. However, the NNFM loss does not consider global style information. In addition, the implicit representation of NeRF limits fine-grained control over the resulting scenes. In this paper, we introduce ABC-GS, a novel framework based on 3D Gaussian Splatting to achieve high-quality 3D style transfer. To this end, a controllable matching stage is designed to achieve precise alignment between scene content and style features through segmentation masks. Moreover, a style transfer loss function based on feature alignment is proposed to ensure that the outcomes of style transfer accurately reflect the global style of the reference image. Furthermore, the original geometric information of the scene is preserved with the depth loss and Gaussian regularization terms. Extensive experiments show that our ABC-GS provides controllability of style transfer and achieves stylization results that are more faithfully aligned with the global style of the chosen artistic reference. Our homepage is available at https://vpx-ecnu.github.io/ABC-GS-website.
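For context, here is a minimal PyTorch sketch of the two kinds of objective the abstract contrasts: the local NNFM loss (as in ARF-style NeRF stylization) and a simple global-statistics term. The feature shapes and the global term are illustrative assumptions; ABC-GS's actual feature-alignment loss is not reproduced here.

```python
import torch
import torch.nn.functional as F

def nnfm_loss(render_feats: torch.Tensor, style_feats: torch.Tensor) -> torch.Tensor:
    """Nearest Neighbor Feature Matching (NNFM) loss: each rendered feature
    is pulled toward its single nearest style feature under cosine distance,
    so no global-style statistics enter the objective.

    render_feats: (N, C) features of the rendered view (one row per pixel).
    style_feats:  (M, C) features of the style image.
    """
    r = F.normalize(render_feats, dim=1)      # (N, C) unit-norm rows
    s = F.normalize(style_feats, dim=1)       # (M, C) unit-norm rows
    cos_dist = 1.0 - r @ s.t()                # (N, M) pairwise cosine distances
    return cos_dist.min(dim=1).values.mean()  # nearest style feature per pixel

def global_stats_loss(render_feats: torch.Tensor, style_feats: torch.Tensor) -> torch.Tensor:
    """A minimal global term: match per-channel mean and std of the two
    feature sets. This is NOT ABC-GS's feature-alignment loss, only an
    illustration of what a 'global style' objective adds over NNFM."""
    mu = F.mse_loss(render_feats.mean(dim=0), style_feats.mean(dim=0))
    sigma = F.mse_loss(render_feats.std(dim=0), style_feats.std(dim=0))
    return mu + sigma
```

In a full pipeline both terms would be computed on VGG feature maps of the rendered view and the style image, and weighted against the depth loss and Gaussian regularization terms the abstract mentions.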
Related papers
- WaSt-3D: Wasserstein-2 Distance for Scene-to-Scene Stylization on 3D Gaussians [37.139479729087896]
We develop a new style transfer method for 3D scenes called WaSt-3D.
It faithfully transfers details from style scenes to the content scene without requiring any training, delivering consistent results across diverse content and style pairs (the closed-form Wasserstein-2 distance between Gaussians that the title refers to is sketched below).
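For reference, the Wasserstein-2 distance between two Gaussians has a well-known closed form; the PyTorch sketch below shows only that metric (assuming symmetric PSD covariances), not how WaSt-3D pairs Gaussians between the style and content scenes.

```python
import torch

def sqrtm_psd(a: torch.Tensor) -> torch.Tensor:
    """Square root of a symmetric PSD matrix via eigendecomposition."""
    w, v = torch.linalg.eigh(a)
    return v @ torch.diag(w.clamp(min=0.0).sqrt()) @ v.t()

def w2_squared(m1, cov1, m2, cov2):
    """Closed-form squared Wasserstein-2 distance between N(m1, cov1) and
    N(m2, cov2):
        W2^2 = ||m1 - m2||^2 + Tr(cov1 + cov2 - 2 * sqrtm(s2 @ cov1 @ s2))
    where s2 = sqrtm(cov2).
    """
    s2 = sqrtm_psd(cov2)
    cross = sqrtm_psd(s2 @ cov1 @ s2)
    return (m1 - m2).pow(2).sum() + torch.trace(cov1 + cov2 - 2.0 * cross)

# Example: distance between two 3D Gaussians (the primitives of 3DGS).
m1, m2 = torch.zeros(3), torch.ones(3)
cov1, cov2 = torch.eye(3), 2.0 * torch.eye(3)
print(w2_squared(m1, cov1, m2, cov2))  # 3 + 3 * (1 - sqrt(2))**2
```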
arXiv Detail & Related papers (2024-09-26T15:02:50Z)
- G-Style: Stylized Gaussian Splatting [5.363168481735954]
We introduce G-Style, a novel algorithm designed to transfer the style of an image onto a 3D scene represented using Gaussian Splatting.
G-Style generates high-quality stylizations within just a few minutes, outperforming existing methods both qualitatively and quantitatively.
arXiv Detail & Related papers (2024-08-28T10:43:42Z)
- Reference-based Controllable Scene Stylization with Gaussian Splatting [30.321151430263946]
Reference-based scene stylization that edits the appearance based on a content-aligned reference image is an emerging research area.
We propose ReGS, which adapts 3D Gaussian Splatting (3DGS) for reference-based stylization to enable real-time stylized view synthesis.
arXiv Detail & Related papers (2024-07-09T20:30:29Z)
- CoARF: Controllable 3D Artistic Style Transfer for Radiance Fields [7.651502365257349]
We introduce Controllable Artistic Radiance Fields (CoARF), a novel algorithm for controllable 3D scene stylization.
CoARF provides user-specified controllability of style transfer and superior style transfer quality with more precise feature matching.
arXiv Detail & Related papers (2024-04-23T12:22:32Z)
- StylizedGS: Controllable Stylization for 3D Gaussian Splatting [53.0225128090909]
StylizedGS is an efficient 3D neural style transfer framework with adaptable control over perceptual factors.
Our method achieves high-quality stylization results characterized by faithful brushstrokes and geometric consistency with flexible controls.
arXiv Detail & Related papers (2024-04-08T06:32:11Z)
- GaussianStyle: Gaussian Head Avatar via StyleGAN [64.85782838199427]
We propose a novel framework that integrates the volumetric strengths of 3DGS with the powerful implicit representation of StyleGAN.
We show that our method achieves state-of-the-art performance in reenactment, novel view synthesis, and animation.
arXiv Detail & Related papers (2024-02-01T18:14:42Z)
- Locally Stylized Neural Radiance Fields [30.037649804991315]
We propose a stylization framework for neural radiance fields (NeRF) based on local style transfer.
In particular, we use a hash-grid encoding to learn the embedding of the appearance and geometry components (a minimal sketch of such an encoding follows this entry).
We show that our method yields plausible stylization results with novel view synthesis.
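The hash-grid encoding mentioned above follows the multi-resolution hashing scheme popularized by Instant-NGP. Below is a minimal sketch with illustrative hyperparameters and nearest-vertex lookup instead of the usual trilinear interpolation; it is an assumption-laden stand-in, not the paper's implementation.

```python
import torch
import torch.nn as nn

class HashGridEncoding(nn.Module):
    """Minimal multi-resolution hash-grid encoding (Instant-NGP style).

    Each level snaps a 3D point to its nearest grid vertex, hashes the
    integer vertex coordinates into a small learnable table, and the
    per-level features are concatenated.
    """
    def __init__(self, n_levels=4, table_size=2**14, feat_dim=2,
                 base_res=16, growth=2.0):
        super().__init__()
        self.tables = nn.ModuleList(
            nn.Embedding(table_size, feat_dim) for _ in range(n_levels))
        self.resolutions = [int(base_res * growth ** i) for i in range(n_levels)]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, 3) points normalized to [0, 1]^3
        feats = []
        for table, res in zip(self.tables, self.resolutions):
            v = (x.clamp(0.0, 1.0) * (res - 1)).round().long()  # nearest vertex
            # spatial hash with the primes used by Instant-NGP
            h = v[:, 0] ^ (v[:, 1] * 2654435761) ^ (v[:, 2] * 805459861)
            feats.append(table(h % table.num_embeddings))
        return torch.cat(feats, dim=-1)  # (N, n_levels * feat_dim)

enc = HashGridEncoding()
print(enc(torch.rand(1024, 3)).shape)  # torch.Size([1024, 8])
```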
arXiv Detail & Related papers (2023-09-19T15:08:10Z)
- StyleRF: Zero-shot 3D Style Transfer of Neural Radiance Fields [52.19291190355375]
StyleRF (Style Radiance Fields) is an innovative 3D style transfer technique.
It employs an explicit grid of high-level features to represent 3D scenes, with which high-fidelity geometry can be reliably restored via volume rendering.
It transforms the grid features according to the reference style which directly leads to high-quality zero-shot style transfer.
arXiv Detail & Related papers (2023-03-19T08:26:06Z)
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
- Third Time's the Charm? Image and Video Editing with StyleGAN3 [70.36056009463738]
StyleGAN is arguably one of the most intriguing and well-studied generative models.
We explore the recent StyleGAN3 architecture, compare it to its predecessor, and investigate its unique advantages, as well as drawbacks.
arXiv Detail & Related papers (2022-01-31T18:44:59Z)