TexPro: Text-guided PBR Texturing with Procedural Material Modeling
- URL: http://arxiv.org/abs/2410.15891v1
- Date: Mon, 21 Oct 2024 11:10:07 GMT
- Title: TexPro: Text-guided PBR Texturing with Procedural Material Modeling
- Authors: Ziqiang Dang, Wenqi Dong, Zesong Yang, Bangbang Yang, Liang Li, Yuewen Ma, Zhaopeng Cui
- Abstract summary: TexPro is a novel method for high-fidelity material generation for input 3D meshes given text prompts.
We first generate multi-view reference images given the input textual prompt by employing the latest text-to-image model.
We then derive texture maps through a rendering-based optimization with recent differentiable procedural materials.
- Abstract: In this paper, we present TexPro, a novel method for high-fidelity material generation for input 3D meshes given text prompts. Unlike existing text-conditioned texture generation methods that typically generate RGB textures with baked lighting, TexPro is able to produce diverse texture maps via procedural material modeling, which enables physically based rendering, relighting, and additional benefits inherent to procedural materials. Specifically, we first generate multi-view reference images given the input textual prompt by employing the latest text-to-image model. We then derive texture maps through a rendering-based optimization with recent differentiable procedural materials. To this end, we design several techniques to handle the misalignment between the generated multi-view images and 3D meshes, and introduce a novel material agent that enhances material classification and matching by exploring both part-level understanding and object-aware material reasoning. Experiments demonstrate the superiority of the proposed method over existing state-of-the-art methods and its capability of relighting.
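The core idea of the second stage, fitting procedural material parameters so that a differentiable rendering matches the reference images, can be illustrated with a toy sketch. This is not the paper's actual pipeline: the flat-albedo "material", the Lambertian renderer, and the closed-form gradient are all illustrative assumptions standing in for a real procedural material graph and a differentiable renderer.

```python
import numpy as np

def render(albedo, light=0.8):
    """Toy differentiable 'renderer': Lambertian shading of a 4x4
    patch with a constant RGB albedo (a stand-in for evaluating a
    procedural material graph and rendering the mesh)."""
    return light * np.ones((4, 4, 3)) * albedo

def optimize(reference, steps=200, lr=0.5, light=0.8):
    """Rendering-based optimization: adjust material parameters
    (here, just albedo) to minimize MSE against a reference image."""
    albedo = np.array([0.5, 0.5, 0.5])  # initial material parameters
    n = reference.size
    for _ in range(steps):
        img = render(albedo, light)
        # The toy renderer is linear in albedo, so d(MSE)/d(albedo)
        # has a closed form; a real pipeline would obtain this
        # gradient from a differentiable procedural-material renderer.
        grad = 2.0 * light * (img - reference).sum(axis=(0, 1)) / n
        albedo -= lr * grad
    return albedo

# Recover a 'ground-truth' material from its rendering.
target_albedo = np.array([0.2, 0.6, 0.9])
reference = render(target_albedo)
fitted = optimize(reference)
```

Gradient descent converges because the loss is convex in this linear toy; in the real setting the same loop runs over multi-view reference images with the gradients supplied by autodiff through the material graph.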
Related papers
- TexGen: Text-Guided 3D Texture Generation with Multi-view Sampling and Resampling [37.67373829836975]
We present TexGen, a novel multi-view sampling and resampling framework for texture generation.
Our proposed method produces significantly better texture quality for diverse 3D objects with a high degree of view consistency.
Our proposed texture generation technique can also be applied to texture editing while preserving the original identity.
arXiv Detail & Related papers (2024-08-02T14:24:40Z) - Infinite Texture: Text-guided High Resolution Diffusion Texture Synthesis [61.189479577198846]
We present Infinite Texture, a method for generating arbitrarily large texture images from a text prompt.
Our approach fine-tunes a diffusion model on a single texture, and learns to embed that statistical distribution in the output domain of the model.
At generation time, our fine-tuned diffusion model is used through a score aggregation strategy to generate output texture images of arbitrary resolution on a single GPU.
arXiv Detail & Related papers (2024-05-13T21:53:09Z) - MaPa: Text-driven Photorealistic Material Painting for 3D Shapes [80.66880375862628]
This paper aims to generate materials for 3D meshes from text descriptions.
Unlike existing methods that synthesize texture maps, we propose to generate segment-wise procedural material graphs.
Our framework supports high-quality rendering and provides substantial flexibility in editing.
arXiv Detail & Related papers (2024-04-26T17:54:38Z) - MatAtlas: Text-driven Consistent Geometry Texturing and Material Assignment [11.721314027024547]
MatAtlas is a method for consistent text-guided texturing and material assignment of 3D models.
By proposing a multi-step texture refinement process, we significantly improve the quality and consistency.
arXiv Detail & Related papers (2024-04-03T17:57:15Z) - TexRO: Generating Delicate Textures of 3D Models by Recursive Optimization [54.59133974444805]
TexRO is a novel method for generating delicate textures of a known 3D mesh by optimizing its UV texture.
We demonstrate the superior performance of TexRO in terms of texture quality, detail preservation, visual consistency, and, notably, runtime speed.
arXiv Detail & Related papers (2024-03-22T07:45:51Z) - TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion [64.49276500129092]
TextureDreamer is an image-guided texture synthesis method.
It can transfer relightable textures from a small number of input images to target 3D shapes across arbitrary categories.
arXiv Detail & Related papers (2024-01-17T18:55:49Z) - TexFusion: Synthesizing 3D Textures with Text-Guided Image Diffusion Models [77.85129451435704]
We present a new method to synthesize textures for 3D assets using large-scale text-guided image diffusion models.
Specifically, we leverage latent diffusion models, applying the denoising model to a set of 2D renders and aggregating the denoising predictions on a shared latent texture map.
arXiv Detail & Related papers (2023-10-20T19:15:29Z) - Text2Tex: Text-driven Texture Synthesis via Diffusion Models [31.773823357617093]
We present Text2Tex, a novel method for generating high-quality textures for 3D meshes from text prompts.
Our method incorporates inpainting into a pre-trained depth-aware image diffusion model to progressively synthesize high resolution partial textures from multiple viewpoints.
arXiv Detail & Related papers (2023-03-20T19:02:13Z) - TEXTure: Text-Guided Texturing of 3D Shapes [71.13116133846084]
We present TEXTure, a novel method for text-guided generation, editing, and transfer of textures for 3D shapes.
We define a trimap partitioning process that generates seamless textures over 3D shapes.
arXiv Detail & Related papers (2023-02-03T13:18:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.