DreamSpace: Dreaming Your Room Space with Text-Driven Panoramic Texture Propagation
- URL: http://arxiv.org/abs/2310.13119v1
- Date: Thu, 19 Oct 2023 19:29:23 GMT
- Title: DreamSpace: Dreaming Your Room Space with Text-Driven Panoramic Texture Propagation
- Authors: Bangbang Yang, Wenqi Dong, Lin Ma, Wenbo Hu, Xiao Liu, Zhaopeng Cui, Yuewen Ma
- Abstract summary: In this paper, we propose a novel framework to generate 3D textures for immersive VR experiences.
To cope with cluttered geometries, we conduct texture inpainting in confident regions and learn an implicit imitating network to synthesize textures in occluded areas.
- Score: 31.353409149640605
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion-based methods have achieved prominent success in generating 2D
media. However, achieving similar proficiency for scene-level mesh
texturing in 3D spatial applications, e.g., XR/VR, remains constrained,
primarily due to the intricate nature of 3D geometry and the necessity for
immersive free-viewpoint rendering. In this paper, we propose a novel indoor
scene texturing framework, which delivers text-driven texture generation with
enchanting details and authentic spatial coherence. The key insight is to first
imagine a stylized 360° panoramic texture from the central viewpoint of
the scene, and then propagate it to the remaining areas with inpainting and
imitating techniques. To ensure textures that are meaningful and aligned to the scene,
we develop a novel coarse-to-fine panoramic texture generation approach with
dual texture alignment, which considers both the geometry and texture cues of
the captured scenes. To cope with cluttered geometries during texture
propagation, we design a separation strategy that conducts texture inpainting
in confident regions and then learns an implicit imitating network to
synthesize textures in occluded and tiny structural areas. Extensive
experiments and the immersive VR application on real-world indoor scenes
demonstrate the high quality of the generated textures and the engaging
experience on VR headsets. Project webpage:
https://ybbbbt.com/publication/dreamspace
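
The propagation step described above can be made concrete. Below is a minimal, self-contained sketch (not the authors' released code; the equirectangular mapping, the y-up convention, and all function names are illustrative assumptions) of how colors from a 360° panorama generated at the scene's central viewpoint could be looked up for 3D surface points:

    import numpy as np

    def panorama_uv(points, center):
        """Map 3D points to (u, v) in [0, 1]^2 of an equirectangular
        panorama centered at `center` (y-up convention, assumed here)."""
        d = points - center
        d = d / np.linalg.norm(d, axis=-1, keepdims=True)
        theta = np.arctan2(d[..., 0], d[..., 2])        # azimuth in (-pi, pi]
        phi = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))  # elevation
        return np.stack([theta / (2 * np.pi) + 0.5, 0.5 - phi / np.pi], axis=-1)

    def sample_panorama(pano, points, center):
        """Nearest-neighbor color lookup from the panorama image."""
        uv = panorama_uv(points, center)
        h, w = pano.shape[:2]
        x = np.clip((uv[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
        y = np.clip((uv[..., 1] * (h - 1)).round().astype(int), 0, h - 1)
        return pano[y, x]

    # Usage: color mesh vertices from a generated 512x1024 panorama.
    pano = np.random.rand(512, 1024, 3)     # stand-in for a generated texture
    verts = np.random.randn(1000, 3) * 2.0  # stand-in for mesh vertices
    colors = sample_panorama(pano, verts, center=np.zeros(3))

Note that this lookup alone ignores visibility, which is exactly why the abstract separates confident regions (textured directly and inpainted) from occluded and tiny structures (handled by the implicit imitating network).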
Related papers
- SceneCraft: Layout-Guided 3D Scene Generation [29.713491313796084]
SceneCraft is a novel method for generating detailed indoor scenes that adhere to textual descriptions and spatial layout preferences.
Our method significantly outperforms existing approaches in complex indoor scene generation with diverse textures, consistent geometry, and realistic visual quality.
arXiv Detail & Related papers (2024-10-11T17:59:58Z)
- Meta 3D TextureGen: Fast and Consistent Texture Generation for 3D Objects [54.80813150893719]
We introduce Meta 3D TextureGen: a new feedforward method comprising two sequential networks that generate high-quality textures in less than 20 seconds.
Our method achieves state-of-the-art results in quality and speed by conditioning a text-to-image model on 3D semantics in 2D space and fusing the outputs into a complete, high-resolution UV texture map.
In addition, we introduce a texture enhancement network that is capable of up-scaling any texture by an arbitrary ratio, producing 4k pixel resolution textures.
arXiv Detail & Related papers (2024-07-02T17:04:34Z)
- RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting [34.827355403635536]
We propose a 3D scene texturing framework referred to as RoomTex.
RoomTex generates high-fidelity and style-consistent textures for untextured compositional scene meshes.
To keep successive views consistent, the method maintains alignment between rendered RGB images and edge maps throughout the iterative inpainting process (a generic sketch of this loop appears after this list).
arXiv Detail & Related papers (2024-06-04T16:27:09Z)
- FlashTex: Fast Relightable Mesh Texturing with LightControlNet [105.4683880648901]
We introduce LightControlNet, a new text-to-image model based on the ControlNet architecture.
We apply our approach to disentangle material/reflectance in the resulting texture so that the mesh can be properly lit and rendered in any lighting environment.
arXiv Detail & Related papers (2024-02-20T18:59:00Z)
- TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion [64.49276500129092]
TextureDreamer is an image-guided texture synthesis method.
It can transfer relightable textures from a small number of input images to target 3D shapes across arbitrary categories.
arXiv Detail & Related papers (2024-01-17T18:55:49Z)
- SceneTex: High-Quality Texture Synthesis for Indoor Scenes via Diffusion Priors [49.03627933561738]
SceneTex is a novel method for generating high-quality and style-consistent textures for indoor scenes using depth-to-image diffusion priors.
SceneTex enables diverse and accurate texture synthesis for 3D-FRONT scenes, demonstrating significant improvements in visual quality and prompt fidelity over prior texture generation methods.
arXiv Detail & Related papers (2023-11-28T22:49:57Z)
- Text2Scene: Text-driven Indoor Scene Stylization with Part-aware Details [12.660352353074012]
We propose Text2Scene, a method to automatically create realistic textures for virtual scenes composed of multiple objects.
Our pipeline adds detailed texture to labeled 3D geometries in the room such that the generated colors respect the hierarchical structure of semantic parts, which are often composed of similar materials.
arXiv Detail & Related papers (2023-08-31T17:37:23Z)
- RoomDreamer: Text-Driven 3D Indoor Scene Synthesis with Coherent Geometry and Texture [80.0643976406225]
We propose "RoomDreamer", which leverages powerful natural language to synthesize a new room with a different style.
Our work addresses the challenge of synthesizing both geometry and texture aligned to the input scene structure and prompt simultaneously.
To validate the proposed method, real indoor scenes scanned with smartphones are used for extensive experiments.
arXiv Detail & Related papers (2023-05-18T22:57:57Z)
- AUV-Net: Learning Aligned UV Maps for Texture Transfer and Synthesis [78.17671694498185]
We propose AUV-Net which learns to embed 3D surfaces into a 2D aligned UV space.
As a result, textures are aligned across objects, and can thus be easily synthesized by generative models of images.
The learned UV mapping and aligned texture representations enable a variety of applications including texture transfer, texture synthesis, and textured single view 3D reconstruction.
arXiv Detail & Related papers (2022-04-06T21:39:24Z)
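
As noted in the RoomTex entry above, several of the listed methods share an iterative view-by-view inpainting loop: render the partially textured scene from a new camera pose, fill the untextured pixels with a depth- or edge-conditioned diffusion inpainter, and bake the result back into the texture atlas. The following is a generic, runnable skeleton of that loop; the renderer, inpainter, and baking step are trivial stand-ins, not any paper's actual API:

    import numpy as np

    H, W = 256, 256

    def render_view(atlas, pose):
        """Stand-in renderer: returns an RGB image and a mask of untextured
        pixels. A real system would rasterize the mesh with its atlas."""
        rgb = np.random.rand(H, W, 3)
        hole_mask = np.random.rand(H, W) > 0.7
        return rgb, hole_mask

    def inpaint(rgb, hole_mask, prompt, edge_map):
        """Stand-in for a depth/edge-conditioned diffusion inpainter
        (e.g., ControlNet-guided). Here: fill holes with the mean color."""
        out = rgb.copy()
        out[hole_mask] = rgb[~hole_mask].mean(axis=0)
        return out

    def bake(atlas, rgb, pose):
        """Stand-in for back-projecting the inpainted view into UV space."""
        return atlas

    atlas = np.zeros((1024, 1024, 3))
    for pose in range(8):  # a camera trajectory through the room
        rgb, holes = render_view(atlas, pose)
        edges = np.zeros((H, W))  # stand-in edge map, kept aligned with rgb
        rgb = inpaint(rgb, holes, "a cozy Scandinavian living room", edges)
        atlas = bake(atlas, rgb, pose)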
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.