Locally Stylized Neural Radiance Fields
- URL: http://arxiv.org/abs/2309.10684v1
- Date: Tue, 19 Sep 2023 15:08:10 GMT
- Title: Locally Stylized Neural Radiance Fields
- Authors: Hong-Wing Pang, Binh-Son Hua, Sai-Kit Yeung
- Abstract summary: We propose a stylization framework for neural radiance fields (NeRF) based on local style transfer.
In particular, we use a hash-grid encoding to learn the embedding of the appearance and geometry components.
We show that our method yields plausible stylization results with novel view synthesis.
- Score: 30.037649804991315
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, there has been increasing interest in applying stylization
on 3D scenes from a reference style image, in particular onto neural radiance
fields (NeRF). While performing stylization directly on NeRF guarantees
appearance consistency over arbitrary novel views, it is a challenging problem
to guide the transfer of patterns from the style image onto different parts of
the NeRF scene. In this work, we propose a stylization framework for NeRF based
on local style transfer. In particular, we use a hash-grid encoding to learn
the embedding of the appearance and geometry components, and show that the
mapping defined by the hash table allows us to control the stylization to a
certain extent. Stylization is then achieved by optimizing the appearance
branch while keeping the geometry branch fixed. To support local style
transfer, we propose a new loss function that utilizes a segmentation network
and bipartite matching to establish region correspondences between the style
image and the content images obtained from volume rendering. Our experiments
show that our method yields plausible stylization results with novel view
synthesis while having flexible controllability via manipulating and
customizing the region correspondences.
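As a rough illustration of the region-correspondence idea in the abstract, the minimal sketch below matches segmented regions of a rendered content view to regions of the style image with bipartite (Hungarian) matching and penalizes the distance between matched region descriptors. The function names and the choice of mean-pooled features are illustrative assumptions, not the paper's exact loss.

```python
# Minimal sketch of region-level style matching via bipartite assignment.
# All names are illustrative; the paper's actual loss and features may differ.
import torch
from scipy.optimize import linear_sum_assignment

def region_features(feat_map, masks):
    """Average a feature map (C, H, W) over each binary region mask in (R, H, W)."""
    feats = []
    for m in masks:
        w = m.float() / m.float().sum().clamp(min=1.0)
        feats.append((feat_map * w.unsqueeze(0)).sum(dim=(1, 2)))  # (C,)
    return torch.stack(feats)  # (R, C)

def match_regions(content_feats, style_feats):
    """Hungarian matching on pairwise L2 distances between region descriptors."""
    cost = torch.cdist(content_feats, style_feats)  # (Rc, Rs)
    rows, cols = linear_sum_assignment(cost.detach().cpu().numpy())
    return list(zip(rows.tolist(), cols.tolist()))

def region_style_loss(content_feats, style_feats, matches):
    """Pull each rendered region's descriptor toward its matched style region."""
    return torch.stack([
        (content_feats[i] - style_feats[j]).pow(2).mean() for i, j in matches
    ]).mean()
```

In this reading, the matches could be recomputed or cached, and hand-editing the match list is one way the "manipulating and customizing the region correspondences" controllability described in the abstract could surface.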
Related papers
- Instant Photorealistic Neural Radiance Fields Stylization [1.039189397779466]
We present Instant Neural Radiance Fields Stylization, a novel approach for multi-view image stylization of 3D scenes.
Our approach models a neural radiance field based on neural graphics primitives, which use a hash table-based position encoder for position embedding; a minimal sketch of such an encoder follows this entry.
Our method can generate stylized novel views with a consistent appearance at various view angles in less than 10 minutes on modern GPU hardware.
arXiv Detail & Related papers (2023-03-29T17:53:20Z)
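The hash-table position encoder mentioned in the entry above follows the multiresolution hash-grid idea behind neural graphics primitives. The sketch below is a deliberately simplified, assumption-level version: it hashes only the nearest grid vertex per level, whereas real implementations XOR per-axis products and trilinearly interpolate the eight surrounding corners, and all sizes are illustrative.

```python
# Simplified sketch of a multiresolution hash-table position encoder.
# Real implementations XOR per-axis products and trilinearly interpolate
# the 8 surrounding grid corners; this hashes the nearest vertex only.
import torch
import torch.nn as nn

class HashEncoder(nn.Module):
    def __init__(self, levels=8, table_size=2**14, feat_dim=2, base_res=16):
        super().__init__()
        self.tables = nn.ModuleList(
            nn.Embedding(table_size, feat_dim) for _ in range(levels)
        )
        self.res = [base_res * 2**l for l in range(levels)]
        self.primes = torch.tensor([1, 2654435761, 805459861])  # spatial-hash primes

    def forward(self, x):  # x: (N, 3) points in [0, 1]^3
        feats = []
        for table, res in zip(self.tables, self.res):
            idx = (x * res).long()                        # nearest grid vertex
            h = (idx * self.primes.to(idx.device)).sum(-1)
            h = h % table.num_embeddings                  # fold into the table
            feats.append(table(h))
        return torch.cat(feats, dim=-1)                   # (N, levels * feat_dim)
```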
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer that introduces an input-dependent temperature; a minimal sketch of this idea follows the entry.
Our framework consists of three key components: a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of the style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
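A minimal sketch of what an input-dependent temperature could look like in an InfoNCE-style loss; `temp_net` and all shapes are hypothetical, and UCAST's actual formulation is more elaborate.

```python
# Hedged sketch of a contrastive style loss with an input-dependent temperature.
# temp_net and all shapes are illustrative assumptions, not UCAST's exact design.
import torch
import torch.nn.functional as F

def adaptive_contrastive_loss(anchor, positive, negatives, temp_net):
    """InfoNCE where each anchor predicts its own temperature.

    anchor, positive: (N, D) style embeddings; negatives: (N, K, D).
    temp_net: small MLP mapping an anchor embedding to a (N, 1) scalar.
    """
    tau = F.softplus(temp_net(anchor)) + 1e-3  # per-sample, input-dependent
    pos = F.cosine_similarity(anchor, positive, dim=-1).unsqueeze(1)    # (N, 1)
    neg = F.cosine_similarity(anchor.unsqueeze(1), negatives, dim=-1)   # (N, K)
    logits = torch.cat([pos, neg], dim=1) / tau  # sharper or softer per sample
    labels = torch.zeros(len(anchor), dtype=torch.long, device=anchor.device)
    return F.cross_entropy(logits, labels)
```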
- StyleTRF: Stylizing Tensorial Radiance Fields [7.9020917073764405]
Stylized view generation of scenes captured casually using a camera has received much attention recently.
We present StyleTRF, a compact, quick-to-optimize strategy for stylized view generation using TensoRF.
arXiv Detail & Related papers (2022-12-19T09:50:05Z)
- NeRF-Art: Text-Driven Neural Radiance Fields Stylization [38.3724634394761]
We present NeRF-Art, a text-guided NeRF stylization approach that manipulates the style of a pre-trained NeRF model with a simple text prompt.
We show that our method is effective and robust regarding both single-view stylization quality and cross-view consistency.
arXiv Detail & Related papers (2022-12-15T18:59:58Z)
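Text-driven NeRF stylization of this kind is typically steered by a frozen CLIP model. The sketch below shows one plausible cosine-distance guidance loss; it is an assumption-level illustration (NeRF-Art's actual objectives, such as its contrastive and perceptual terms, are more involved), and the rendered view is assumed to be already resized and normalized with CLIP's statistics.

```python
# Hedged sketch of CLIP text guidance for stylizing rendered views.
# NeRF-Art's actual losses are more involved; this is the basic mechanism only.
import clip
import torch
import torch.nn.functional as F

model, _ = clip.load("ViT-B/32")  # frozen CLIP encoder
model.eval()

@torch.no_grad()
def embed_text(prompt):
    tokens = clip.tokenize([prompt])
    return F.normalize(model.encode_text(tokens), dim=-1)

def clip_style_loss(render, text_emb):
    """render: (1, 3, 224, 224), resized/normalized with CLIP statistics."""
    img_emb = F.normalize(model.encode_image(render), dim=-1)
    return 1.0 - (img_emb * text_emb).sum(-1).mean()  # cosine distance to prompt
```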
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasant artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
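A common feature-level objective for stylizing radiance fields, and the one ARF is known for, is a nearest-neighbor feature matching (NNFM) loss over VGG features: each rendered feature is pulled toward its closest style feature. A minimal sketch, with all shapes illustrative:

```python
# Sketch of a nearest-neighbor feature matching (NNFM) style loss.
# Shapes and the cosine-distance choice are illustrative assumptions.
import torch
import torch.nn.functional as F

def nnfm_loss(render_feats, style_feats):
    """render_feats: (Nr, C) VGG features of the render; style_feats: (Ns, C)."""
    r = F.normalize(render_feats, dim=-1)
    s = F.normalize(style_feats, dim=-1)
    cos = r @ s.t()                    # (Nr, Ns) cosine similarities
    # each rendered feature is matched to its single nearest style feature
    return (1.0 - cos.max(dim=1).values).mean()
```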
- StylizedNeRF: Consistent 3D Scene Stylization as Stylized NeRF via 2D-3D Mutual Learning [50.65015652968839]
3D scene stylization aims at generating stylized images of the scene from arbitrary novel views.
Thanks to recently proposed neural radiance fields (NeRF), we are able to represent a 3D scene in a consistent way.
We propose a novel mutual learning framework for 3D scene stylization that combines a 2D image stylization network and NeRF.
arXiv Detail & Related papers (2022-05-24T16:29:50Z)
- Learning to Stylize Novel Views [82.24095446809946]
We tackle a 3D scene stylization problem - generating stylized images of a scene from arbitrary novel views.
We propose a point cloud-based method for consistent 3D scene stylization.
arXiv Detail & Related papers (2021-05-27T23:58:18Z)
- Stylizing 3D Scene via Implicit Representation and HyperNetwork [34.22448260525455]
A straightforward solution is to combine existing novel view synthesis and image/video style transfer approaches.
Inspired by the high quality results of the neural radiance fields (NeRF) method, we propose a joint framework to directly render novel views with the desired style.
Our framework consists of two components: an implicit representation of the 3D scene with the neural radiance field model, and a hypernetwork to transfer the style information into the scene representation.
arXiv Detail & Related papers (2021-05-27T09:11:30Z)
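A minimal sketch of the hypernetwork idea described in the entry above: a small network predicts the weights of a color MLP from a style embedding, so changing the style embedding restyles the scene representation. All dimensions and the two-layer target MLP are illustrative assumptions.

```python
# Sketch of a hypernetwork mapping a style embedding to color-MLP weights.
# Dimensions and the two-layer target MLP are illustrative assumptions.
import torch
import torch.nn as nn

class ColorHyperNet(nn.Module):
    def __init__(self, style_dim=128, in_dim=64, hidden=64, out_dim=3):
        super().__init__()
        self.shapes = [(hidden, in_dim), (hidden,), (out_dim, hidden), (out_dim,)]
        n_params = sum(torch.Size(s).numel() for s in self.shapes)
        self.net = nn.Sequential(
            nn.Linear(style_dim, 256), nn.ReLU(), nn.Linear(256, n_params)
        )

    def forward(self, style_emb, feat):
        """style_emb: (style_dim,); feat: (N, in_dim) per-point scene features."""
        flat = self.net(style_emb)                 # all target-MLP weights at once
        params, i = [], 0
        for s in self.shapes:
            n = torch.Size(s).numel()
            params.append(flat[i:i + n].view(s))
            i += n
        w1, b1, w2, b2 = params
        h = torch.relu(feat @ w1.t() + b1)
        return torch.sigmoid(h @ w2.t() + b2)      # per-point RGB in [0, 1]
```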
- Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer [115.13853805292679]
Artistic style transfer aims at migrating the style from an example image to a content image.
Inspired by the common painting process of drawing a draft and revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle)
Our method can synthesize high quality stylized images in real time, where holistic style patterns are properly transferred.
arXiv Detail & Related papers (2021-04-12T11:53:53Z)
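The draft-and-revise scheme rests on a standard Laplacian pyramid decomposition: stylize a coarse, low-resolution draft, then add back per-level high-frequency residuals. A minimal sketch of the decomposition and reconstruction (LapStyle's actual revision networks are learned):

```python
# Sketch of the Laplacian-pyramid decomposition behind draft-then-revise
# stylization. LapStyle's learned drafting/revision networks are not shown.
import torch
import torch.nn.functional as F

def build_laplacian_pyramid(img, levels=3):
    """img: (B, C, H, W). Returns [residual_0, ..., residual_{L-1}, coarse_base]."""
    pyramid, cur = [], img
    for _ in range(levels):
        down = F.avg_pool2d(cur, 2)
        up = F.interpolate(down, size=cur.shape[-2:], mode="bilinear",
                           align_corners=False)
        pyramid.append(cur - up)   # high-frequency residual at this level
        cur = down
    pyramid.append(cur)            # coarse base: where the "draft" is stylized
    return pyramid

def reconstruct(pyramid):
    cur = pyramid[-1]
    for res in reversed(pyramid[:-1]):
        cur = F.interpolate(cur, size=res.shape[-2:], mode="bilinear",
                            align_corners=False) + res
    return cur
```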
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.