Multi-level Dynamic Style Transfer for NeRFs
- URL: http://arxiv.org/abs/2510.00592v1
- Date: Wed, 01 Oct 2025 07:19:27 GMT
- Title: Multi-level Dynamic Style Transfer for NeRFs
- Authors: Zesheng Li, Shuaibo Li, Wei Ma, Jianwei Guo, Hongbin Zha
- Abstract summary: MDS-NeRF is a novel approach that reengineers the NeRF pipeline specifically for stylization.
We propose a multi-level feature adaptor that generates a multi-level feature grid representation from the content radiance field.
We also present a dynamic style injection module that learns to extract relevant style features and adaptively integrates them into the content patterns.
- Score: 40.439070690681
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As the application of neural radiance fields (NeRFs) in various 3D vision tasks continues to expand, numerous NeRF-based style transfer techniques have been developed. However, existing methods typically integrate style statistics into the original NeRF pipeline, often leading to suboptimal results in both content preservation and artistic stylization. In this paper, we present multi-level dynamic style transfer for NeRFs (MDS-NeRF), a novel approach that reengineers the NeRF pipeline specifically for stylization and incorporates an innovative dynamic style injection module. Particularly, we propose a multi-level feature adaptor that helps generate a multi-level feature grid representation from the content radiance field, effectively capturing the multi-scale spatial structure of the scene. In addition, we present a dynamic style injection module that learns to extract relevant style features and adaptively integrates them into the content patterns. The stylized multi-level features are then transformed into the final stylized view through our proposed multi-level cascade decoder. Furthermore, we extend our 3D style transfer method to support omni-view style transfer using 3D style references. Extensive experiments demonstrate that MDS-NeRF achieves outstanding performance for 3D style transfer, preserving multi-scale spatial structures while effectively transferring stylistic characteristics.
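The abstract describes a dynamic style injection module that "learns to extract relevant style features and adaptively integrates them into the content patterns," applied per level of a multi-level feature grid. The paper's actual architecture is not specified here, but a common way to realize such adaptive injection is cross-attention from content features to style features. The following is a minimal illustrative sketch of that idea, not the authors' implementation; all function names, shapes, and the residual blend are assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_style_injection(content_feats, style_feats):
    """Cross-attention-style injection (hypothetical sketch).

    content_feats: (N, C) content features at one grid level
    style_feats:   (M, C) features extracted from the style reference
    Returns stylized features of shape (N, C).
    """
    C = content_feats.shape[1]
    # Each content feature attends to the style features most relevant to it.
    attn = softmax(content_feats @ style_feats.T / np.sqrt(C), axis=-1)
    injected = attn @ style_feats
    # Residual blend keeps the content structure while adding style.
    return content_feats + injected

# Applied independently at each level of a multi-level feature grid,
# here two toy levels with 64 and 256 spatial locations and 32 channels.
levels = [np.random.randn(64, 32), np.random.randn(256, 32)]
style = np.random.randn(128, 32)
stylized = [dynamic_style_injection(f, style) for f in levels]
```

In the paper, the stylized multi-level features would then be decoded into the final view by the proposed multi-level cascade decoder; this sketch only illustrates the per-level injection step.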
Related papers
- One-Shot Refiner: Boosting Feed-forward Novel View Synthesis via One-Step Diffusion [57.824020826432815]
We present a novel framework for high-fidelity novel view synthesis (NVS) from sparse images.
We design a Dual-Domain Detail Perception Module, which enables handling high-resolution images without being limited by the ViT backbone.
We develop a feature-guided diffusion network, which can preserve high-frequency details during the restoration process.
arXiv Detail & Related papers (2026-01-20T17:11:55Z) - SSGaussian: Semantic-Aware and Structure-Preserving 3D Style Transfer [57.723850794113055]
We propose a novel 3D style transfer pipeline that integrates prior knowledge from pretrained 2D diffusion models.
Our pipeline consists of two key stages: First, we leverage diffusion priors to generate stylized renderings of key viewpoints.
The second is instance-level style transfer, which effectively leverages instance-level consistency across stylized key views and transfers it onto the 3D representation.
arXiv Detail & Related papers (2025-09-04T16:40:44Z) - G3DST: Generalizing 3D Style Transfer with Neural Radiance Fields across Scenes and Styles [45.92812062685523]
Existing methods for 3D style transfer need extensive per-scene optimization for single or multiple styles.
In this work, we overcome the limitations of existing methods by rendering stylized novel views from a NeRF without the need for per-scene or per-style optimization.
Our findings demonstrate that this approach achieves a good visual quality comparable to that of per-scene methods.
arXiv Detail & Related papers (2024-08-24T08:04:19Z) - Style-NeRF2NeRF: 3D Style Transfer From Style-Aligned Multi-View Images [54.56070204172398]
We propose a simple yet effective pipeline for stylizing a 3D scene.
We perform 3D style transfer by refining the source NeRF model using stylized images generated by a style-aligned image-to-image diffusion model.
We demonstrate that our method can transfer diverse artistic styles to real-world 3D scenes with competitive quality.
arXiv Detail & Related papers (2024-06-19T09:36:18Z) - ArtNeRF: A Stylized Neural Field for 3D-Aware Cartoonized Face Synthesis [11.463969116010183]
ArtNeRF is a novel face stylization framework derived from 3D-aware GAN.
We propose an expressive generator to synthesize stylized faces and a triple-branch discriminator module to improve style consistency.
Experiments demonstrate that ArtNeRF is versatile in generating high-quality 3D-aware cartoon faces with arbitrary styles.
arXiv Detail & Related papers (2024-04-21T16:45:35Z) - FPRF: Feed-Forward Photorealistic Style Transfer of Large-Scale 3D Neural Radiance Fields [23.705795612467956]
FPRF stylizes large-scale 3D scenes with arbitrary, multiple style reference images without additional optimization.
FPRF achieves favorable photorealistic quality 3D scene stylization for large-scale scenes with diverse reference images.
arXiv Detail & Related papers (2024-01-10T19:27:28Z) - MM-NeRF: Multimodal-Guided 3D Multi-Style Transfer of Neural Radiance Field [23.050381521558414]
We propose a novel Multimodal-guided 3D Multi-style transfer of NeRF, termed MM-NeRF.
MM-NeRF projects multimodal guidance into a unified space to keep the multimodal styles consistent and extracts multimodal features to guide the 3D stylization.
Experiments on several real-world datasets show that MM-NeRF achieves high-quality 3D multi-style stylization with multimodal guidance.
arXiv Detail & Related papers (2023-09-24T11:04:50Z) - StyleRF: Zero-shot 3D Style Transfer of Neural Radiance Fields [52.19291190355375]
StyleRF (Style Radiance Fields) is an innovative 3D style transfer technique.
It employs an explicit grid of high-level features to represent 3D scenes, with which high-fidelity geometry can be reliably restored via volume rendering.
It transforms the grid features according to the reference style which directly leads to high-quality zero-shot style transfer.
arXiv Detail & Related papers (2023-03-19T08:26:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.