LumiTex: Towards High-Fidelity PBR Texture Generation with Illumination Context
- URL: http://arxiv.org/abs/2511.19437v1
- Date: Mon, 24 Nov 2025 18:59:58 GMT
- Title: LumiTex: Towards High-Fidelity PBR Texture Generation with Illumination Context
- Authors: Jingzhi Bao, Hongze Chen, Lingting Zhu, Chenyu Liu, Runze Zhang, Keyang Luo, Zeyu Hu, Weikai Chen, Yingda Yin, Xin Wang, Zehong Lin, Jun Zhang, Xiaoguang Han,
- Abstract summary: Physically-based rendering (PBR) provides a principled standard for realistic material-lighting interactions in computer graphics. LumiTex is an end-to-end framework that disentangles albedo and metallic-roughness under shared illumination priors. LumiTex achieves state-of-the-art performance in texture quality, surpassing both existing open-source and commercial methods.
- Score: 40.98472333599951
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physically-based rendering (PBR) provides a principled standard for realistic material-lighting interactions in computer graphics. Despite recent advances in generating PBR textures, existing methods fail to address two fundamental challenges: 1) materials decomposition from image prompts under limited illumination cues, and 2) seamless and view-consistent texture completion. To this end, we propose LumiTex, an end-to-end framework that comprises three key components: (1) a multi-branch generation scheme that disentangles albedo and metallic-roughness under shared illumination priors for robust material understanding, (2) a lighting-aware material attention mechanism that injects illumination context into the decoding process for physically grounded generation of albedo, metallic, and roughness maps, and (3) a geometry-guided inpainting module based on a large view synthesis model that enriches texture coverage and ensures seamless, view-consistent UV completion. Extensive experiments demonstrate that LumiTex achieves state-of-the-art performance in texture quality, surpassing both existing open-source and commercial methods.
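To make concrete what the albedo, metallic, and roughness maps in such a pipeline encode, the sketch below evaluates a standard Cook-Torrance metallic-roughness BRDF at a single shading point. This is a generic PBR formulation for illustration only, not LumiTex's actual model; the function name and parameterization are assumptions.

```python
import numpy as np

def ggx_brdf(albedo, metallic, roughness, n_dot_l, n_dot_v, n_dot_h, v_dot_h):
    """Cook-Torrance microfacet BRDF under the metallic-roughness
    parameterization (illustrative sketch, not LumiTex's pipeline).

    albedo: (3,) base color; metallic, roughness: scalars in [0, 1];
    the dot products are cosines between normal/light/view/half vectors.
    """
    a = roughness ** 2
    # GGX normal distribution term
    d = a**2 / (np.pi * (n_dot_h**2 * (a**2 - 1.0) + 1.0) ** 2)
    # Smith-Schlick geometry (shadowing-masking) term
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_l / (n_dot_l * (1.0 - k) + k)) * \
        (n_dot_v / (n_dot_v * (1.0 - k) + k))
    # Schlick Fresnel: base reflectance blends dielectric 0.04 with albedo
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    specular = d * g * f / (4.0 * n_dot_l * n_dot_v + 1e-7)
    # Diffuse lobe vanishes for pure metals
    diffuse = (1.0 - metallic) * (1.0 - f) * albedo / np.pi
    return diffuse + specular
```

Decomposing an image prompt into these three maps is what lets a textured mesh be relit under novel illumination, which is why limited illumination cues at generation time make the decomposition ambiguous.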
Related papers
- LaFiTe: A Generative Latent Field for 3D Native Texturing [72.05710323154288]
Existing native approaches are hampered by the absence of a powerful and versatile representation, which severely limits the fidelity and generality of their generated textures. We introduce LaFiTe, which generates high-quality textures constrained by a sparse color representation and UV parameterization.
arXiv Detail & Related papers (2025-12-04T13:33:49Z)
- MaterialRefGS: Reflective Gaussian Splatting with Multi-view Consistent Material Inference [83.38607296779423]
We show that multi-view consistent material inference with more physically-based environment modeling is key to learning accurate reflections with Gaussian Splatting. Our method faithfully recovers both illumination and geometry, achieving state-of-the-art rendering quality in novel view synthesis.
arXiv Detail & Related papers (2025-10-13T13:29:20Z)
- SeqTex: Generate Mesh Textures in Video Sequence [62.766839821764144]
We introduce SeqTex, a novel end-to-end framework for training 3D texture generative models. We show that SeqTex achieves state-of-the-art performance on both image-conditioned and text-conditioned 3D texture generation tasks.
arXiv Detail & Related papers (2025-07-06T07:58:36Z)
- ePBR: Extended PBR Materials in Image Synthesis [2.3241174970798126]
Intrinsic image representation offers a well-balanced trade-off, decomposing images into fundamental components. We extend intrinsic image representations to incorporate both reflection and transmission properties. With the Extended PBR (ePBR) Materials, we can effectively edit the materials with precise controls.
arXiv Detail & Related papers (2025-04-23T19:15:42Z)
- IntrinsiX: High-Quality PBR Generation using Image Priors [49.90007540430264]
We introduce IntrinsiX, a novel method that generates high-quality intrinsic images from a text description. In contrast to existing text-to-image models whose outputs contain baked-in scene lighting, our approach predicts physically-based rendering (PBR) maps.
arXiv Detail & Related papers (2025-04-01T17:47:48Z)
- RomanTex: Decoupling 3D-aware Rotary Positional Embedded Multi-Attention Network for Texture Synthesis [10.350576861948952]
RomanTex is a multiview-based texture generation framework that integrates a multi-attention network with an underlying 3D representation. Our method achieves state-of-the-art results in texture quality and consistency.
arXiv Detail & Related papers (2025-03-24T17:56:11Z)
- MaterialMVP: Illumination-Invariant Material Generation via Multi-view PBR Diffusion [37.596740171045845]
Physically-based rendering (PBR) has become a cornerstone in modern computer graphics, enabling realistic material representation and lighting interactions in 3D scenes. We present a novel end-to-end model for generating PBR textures from 3D meshes and image prompts, addressing key challenges in multi-view material synthesis.
arXiv Detail & Related papers (2025-03-13T11:57:30Z)
- MCMat: Multiview-Consistent and Physically Accurate PBR Material Generation [30.69364954074992]
UNet-based diffusion models can generate multi-view PBR maps but struggle with multi-view inconsistency, while some 3D methods directly generate UV maps yet face issues due to limited 3D data. We propose a specially designed Transformer-based diffusion model that generates multiview-consistent PBR materials using features from reference views.
arXiv Detail & Related papers (2024-12-18T18:45:35Z)
- OpenMaterial: A Large-scale Dataset of Complex Materials for 3D Reconstruction [55.052637670485716]
We introduce OpenMaterial, a large-scale semi-synthetic dataset for material-aware 3D reconstruction. It comprises 1,001 objects spanning 295 distinct materials, including conductors, dielectrics, plastics, and their roughened variants, captured under 714 diverse lighting conditions. It provides multi-view images, 3D shape models, camera poses, depth maps, and object masks, establishing the first extensive benchmark for evaluating 3D reconstruction on challenging materials.
arXiv Detail & Related papers (2024-06-13T07:46:17Z)
- Enhancing Texture Generation with High-Fidelity Using Advanced Texture Priors [1.4542583614606408]
We propose a high-resolution and high-fidelity texture restoration technique that uses the rough texture as the initial input.
We also introduce a background noise smoothing technique based on a self-supervised scheme to address the noise problem in current high-resolution texture synthesis schemes.
Our approach enables high-resolution texture synthesis, paving the way for high-definition and high-detail texture synthesis technology.
arXiv Detail & Related papers (2024-03-08T07:07:28Z)
- FlashTex: Fast Relightable Mesh Texturing with LightControlNet [105.4683880648901]
We introduce LightControlNet, a new text-to-image model based on the ControlNet architecture.
We apply our approach to disentangle material/reflectance in the resulting texture so that the mesh can be properly lit and rendered in any lighting environment.
arXiv Detail & Related papers (2024-02-20T18:59:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.