S2RF: Semantically Stylized Radiance Fields
- URL: http://arxiv.org/abs/2309.01252v1
- Date: Sun, 3 Sep 2023 19:32:49 GMT
- Title: S2RF: Semantically Stylized Radiance Fields
- Authors: Dishani Lahiri, Neeraj Panse, Moneish Kumar
- Abstract summary: We present our method for transferring style from any arbitrary image(s) to object(s) within a 3D scene.
Our primary objective is to offer more control in 3D scene stylization, facilitating the creation of customizable and stylized scene images from arbitrary viewpoints.
- Score: 1.243080988483032
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present our method for transferring style from any arbitrary image(s) to object(s) within a 3D scene. Our primary objective is to offer more control in 3D scene stylization, facilitating the creation of customizable and stylized scene images from arbitrary viewpoints. To achieve this, we propose a novel approach that incorporates nearest neighborhood-based loss, allowing for flexible 3D scene reconstruction while effectively capturing intricate style details and ensuring multi-view consistency.
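The paper does not include code, but the nearest-neighbor style loss it mentions can be illustrated with a minimal sketch. The idea (common to stylization work in this area) is to match each rendered feature to its closest style feature rather than enforcing a fixed spatial correspondence. The function name, array shapes, and the choice of cosine distance here are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def nn_style_loss(render_feats, style_feats, eps=1e-8):
    """Sketch of a nearest-neighbor feature matching loss.

    For every feature vector extracted from the rendered view, find its
    cosine nearest neighbor among the style-image features and average
    the resulting cosine distances.

    render_feats: (N, D) features from the rendered scene view.
    style_feats:  (M, D) features from the reference style image.
    """
    # Normalize rows so a dot product equals cosine similarity.
    r = render_feats / (np.linalg.norm(render_feats, axis=1, keepdims=True) + eps)
    s = style_feats / (np.linalg.norm(style_feats, axis=1, keepdims=True) + eps)
    sim = r @ s.T                        # (N, M) pairwise cosine similarities
    nn_sim = sim.max(axis=1)             # best style match per rendered feature
    return float(np.mean(1.0 - nn_sim))  # average cosine distance

# If the rendered features already match the style features exactly,
# the loss is (numerically) zero.
feats = np.random.default_rng(0).normal(size=(4, 8))
print(nn_style_loss(feats, feats))
```

Minimizing such a loss pulls each rendered feature toward *some* style feature without dictating which one, which is what allows flexible reconstruction while still capturing fine style detail.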
Related papers
- G3DST: Generalizing 3D Style Transfer with Neural Radiance Fields across Scenes and Styles [45.92812062685523]
Existing methods for 3D style transfer need extensive per-scene optimization for single or multiple styles.
In this work, we overcome the limitations of existing methods by rendering stylized novel views from a NeRF without the need for per-scene or per-style optimization.
Our findings demonstrate that this approach achieves a good visual quality comparable to that of per-scene methods.
arXiv Detail & Related papers (2024-08-24T08:04:19Z)
- StyleSplat: 3D Object Style Transfer with Gaussian Splatting [0.3374875022248866]
Style transfer can enhance 3D assets with diverse artistic styles, transforming creative expression.
We introduce StyleSplat, a method for stylizing 3D objects in scenes represented by 3D Gaussians from reference style images.
We demonstrate its effectiveness across various 3D scenes and styles, showcasing enhanced control and customization in 3D creation.
arXiv Detail & Related papers (2024-07-12T17:55:08Z)
- PNeSM: Arbitrary 3D Scene Stylization via Prompt-Based Neural Style Mapping [16.506819625584654]
3D scene stylization refers to transforming the appearance of a 3D scene to match a given style image.
Several existing methods have obtained impressive results in stylizing 3D scenes.
We propose a novel 3D scene stylization framework to transfer an arbitrary style to an arbitrary scene.
arXiv Detail & Related papers (2024-03-13T05:08:47Z)
- S-DyRF: Reference-Based Stylized Radiance Fields for Dynamic Scenes [58.05447927353328]
Current 3D stylization methods often assume static scenes, which violates the dynamic nature of our real world.
We present S-DyRF, a reference-based temporal stylization method for dynamic neural fields.
Experiments on both synthetic and real-world datasets demonstrate that our method yields plausible stylized results.
arXiv Detail & Related papers (2024-03-10T13:04:01Z)
- SceneWiz3D: Towards Text-guided 3D Scene Composition [134.71933134180782]
Existing approaches either leverage large text-to-image models to optimize a 3D representation or train 3D generators on object-centric datasets.
We introduce SceneWiz3D, a novel approach to synthesize high-fidelity 3D scenes from text.
arXiv Detail & Related papers (2023-12-13T18:59:30Z)
- Towards 4D Human Video Stylization [56.33756124829298]
We present a first step towards 4D (3D and time) human video stylization, which addresses style transfer, novel view synthesis and human animation.
We leverage Neural Radiance Fields (NeRFs) to represent videos, conducting stylization in the rendered feature space.
Our framework uniquely extends its capabilities to accommodate novel poses and viewpoints, making it a versatile tool for creative human video stylization.
arXiv Detail & Related papers (2023-12-07T08:58:33Z)
- StyleRF: Zero-shot 3D Style Transfer of Neural Radiance Fields [52.19291190355375]
StyleRF (Style Radiance Fields) is an innovative 3D style transfer technique.
It employs an explicit grid of high-level features to represent 3D scenes, with which high-fidelity geometry can be reliably restored via volume rendering.
It transforms the grid features according to the reference style which directly leads to high-quality zero-shot style transfer.
arXiv Detail & Related papers (2023-03-19T08:26:06Z)
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
- StyleMesh: Style Transfer for Indoor 3D Scene Reconstructions [11.153966202832933]
We apply style transfer on mesh reconstructions of indoor scenes.
This enables VR applications like experiencing 3D environments painted in the style of a favorite artist.
arXiv Detail & Related papers (2021-12-02T18:59:59Z)
- Learning to Stylize Novel Views [82.24095446809946]
We tackle a 3D scene stylization problem - generating stylized images of a scene from arbitrary novel views.
We propose a point cloud-based method for consistent 3D scene stylization.
arXiv Detail & Related papers (2021-05-27T23:58:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.