Vastextures: Vast repository of textures and PBR materials extracted from real-world images using unsupervised methods
- URL: http://arxiv.org/abs/2406.17146v1
- Date: Mon, 24 Jun 2024 21:36:01 GMT
- Title: Vastextures: Vast repository of textures and PBR materials extracted from real-world images using unsupervised methods
- Authors: Sagi Eppel
- Abstract summary: Vastextures is a repository of 500,000 textures and PBR materials extracted from real-world images using an unsupervised process.
The repository is composed of 2D textures cropped from natural images and SVBRDF/PBR materials generated from these textures.
- Score: 0.6993026261767287
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Vastextures is a vast repository of 500,000 textures and PBR materials extracted from real-world images using an unsupervised process. The extracted materials and textures are extremely diverse and cover a vast range of real-world patterns, but are at the same time less refined than those in existing repositories. The repository is composed of 2D textures cropped from natural images and SVBRDF/PBR materials generated from these textures. Textures and PBR materials are essential for CGI. Existing material repositories focus on games, animation, and art, which demand a limited number of high-quality assets. However, virtual worlds and synthetic data are becoming increasingly important for training AI systems for computer vision. This application demands a huge number of diverse assets but is at the same time less affected by noisy and unrefined assets. Vastextures aims to address this need by creating a free, huge, and diverse asset repository that covers as many real-world materials as possible. The materials are automatically extracted from natural images in two steps: 1) Automatically scanning a vast number of images to identify and crop regions with uniform textures. This is done by splitting the image into a grid of cells and identifying regions in which all of the cells share a similar statistical distribution. 2) Extracting the properties of the PBR material from the cropped texture. This is done by randomly guessing the correlations between the properties of the texture image and the properties of the PBR material. The resulting PBR materials exhibit a vast range of real-world patterns as well as unexpected emergent properties. Neural nets trained on this repository outperformed nets trained on handcrafted assets.
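The two-step pipeline in the abstract can be sketched in code. This is a minimal illustrative sketch, not the paper's released implementation: the per-channel mean/std statistics, grid size, tolerance, the brightness-based roughness/height mapping, and the file path "example.jpg" are all assumptions chosen for demonstration.

```python
import numpy as np
from PIL import Image

def cell_stats(cell):
    # Per-channel mean and standard deviation of one grid cell.
    return np.concatenate([cell.mean(axis=(0, 1)), cell.std(axis=(0, 1))])

def is_uniform_texture(img, grid=8, tol=12.0):
    # Step 1 (sketch): split the candidate crop into a grid of cells and
    # accept it only if all cells share a similar statistical distribution.
    h, w = img.shape[:2]
    ch, cw = h // grid, w // grid
    stats = np.stack([
        cell_stats(img[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw])
        for i in range(grid) for j in range(grid)
    ])
    # Uniform texture: no cell's statistics stray far from the mean.
    spread = np.abs(stats - stats.mean(axis=0)).max()
    return spread < tol

def texture_to_pbr(tex, rng):
    # Step 2 (sketch): derive PBR maps from the texture image. The randomly
    # signed/scaled brightness-to-roughness mapping below is a stand-in for
    # the paper's random guessing of image-to-material property correlations.
    gray = tex.mean(axis=2) / 255.0
    sign = rng.choice([-1.0, 1.0])
    scale = rng.uniform(0.3, 1.0)
    roughness = np.clip(0.5 + sign * scale * (gray - gray.mean()), 0.0, 1.0)
    height = (gray - gray.min()) / (np.ptp(gray) + 1e-8)
    return {"albedo": tex, "roughness": roughness, "height": height}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # "example.jpg" is a placeholder path, not an asset from the paper.
    img = np.asarray(Image.open("example.jpg").convert("RGB"), dtype=np.float32)
    if is_uniform_texture(img):
        pbr = texture_to_pbr(img, rng)
        print({k: np.asarray(v).shape for k, v in pbr.items()})
```

In the described pipeline, step 1 scans many candidate crops per image and step 2 samples many random property correlations; the sketch checks a single crop and a single random mapping to keep the idea visible.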
Related papers
- On Synthetic Texture Datasets: Challenges, Creation, and Curation [1.9567015559455132]
We create a dataset of 362,880 texture images that span 56 textures.
During the process of generating images, we find that NSFW safety filters in image generation pipelines are highly sensitive to texture.
arXiv Detail & Related papers (2024-09-16T14:02:18Z)
- Meta 3D TextureGen: Fast and Consistent Texture Generation for 3D Objects [54.80813150893719]
We introduce Meta 3D TextureGen: a new feedforward method comprised of two sequential networks aimed at generating high-quality textures in less than 20 seconds.
Our method achieves state-of-the-art results in quality and speed by conditioning a text-to-image model on 3D semantics in 2D space and fusing them into a complete and high-resolution UV texture map.
In addition, we introduce a texture enhancement network that is capable of up-scaling any texture by an arbitrary ratio, producing 4k pixel resolution textures.
arXiv Detail & Related papers (2024-07-02T17:04:34Z)
- Infinite Texture: Text-guided High Resolution Diffusion Texture Synthesis [61.189479577198846]
We present Infinite Texture, a method for generating arbitrarily large texture images from a text prompt.
Our approach fine-tunes a diffusion model on a single texture, and learns to embed that statistical distribution in the output domain of the model.
At generation time, our fine-tuned diffusion model is used through a score aggregation strategy to generate output texture images of arbitrary resolution on a single GPU.
arXiv Detail & Related papers (2024-05-13T21:53:09Z)
- MaterialSeg3D: Segmenting Dense Materials from 2D Priors for 3D Assets [63.284244910964475]
We propose a 3D asset material generation framework to infer underlying material from the 2D semantic prior.
Based on such a prior model, we devise a mechanism to parse material in 3D space.
arXiv Detail & Related papers (2024-04-22T07:00:17Z)
- Learning Zero-Shot Material States Segmentation, by Implanting Natural Image Patterns in Synthetic Data [0.555174246084229]
This work aims to bridge the gap by infusing patterns automatically extracted from real-world images into synthetic data.
We present the first comprehensive benchmark for zero-shot material state segmentation.
We also share 300,000 extracted textures and SVBRDF/PBR materials to facilitate future generation.
arXiv Detail & Related papers (2024-03-05T20:21:49Z)
- TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion [64.49276500129092]
TextureDreamer is an image-guided texture synthesis method.
It can transfer relightable textures from a small number of input images to target 3D shapes across arbitrary categories.
arXiv Detail & Related papers (2024-01-17T18:55:49Z)
- Paint-it: Text-to-Texture Synthesis via Deep Convolutional Texture Map Optimization and Physically-Based Rendering [47.78392889256976]
Paint-it is a text-driven high-fidelity texture map synthesis method for 3D rendering.
Paint-it synthesizes texture maps from a text description by synthesis-through-optimization, exploiting Score-Distillation Sampling (SDS).
We show that DC-PBR inherently schedules the optimization curriculum according to texture frequency and naturally filters out the noisy signals from SDS.
arXiv Detail & Related papers (2023-12-18T17:17:08Z)
- Material Palette: Extraction of Materials from a Single Image [19.410479434979493]
We propose a method to extract physically-based rendering (PBR) materials from a single real-world image.
First, we map regions of the image to material concepts using a diffusion model, which allows the sampling of texture images resembling each material in the scene.
Second, we use a separate network to decompose the generated textures into Spatially Varying BRDFs.
arXiv Detail & Related papers (2023-11-28T18:59:58Z)
- TwinTex: Geometry-aware Texture Generation for Abstracted 3D Architectural Models [13.248386665044087]
We present TwinTex, the first automatic texture mapping framework to generate a photo-realistic texture for a piece-wise planar proxy.
Our approach surpasses state-of-the-art texture mapping methods in terms of high-fidelity quality and reaches a human-expert production level with much less effort.
arXiv Detail & Related papers (2023-09-20T12:33:53Z)
- TEXTure: Text-Guided Texturing of 3D Shapes [71.13116133846084]
We present TEXTure, a novel method for text-guided generation, editing, and transfer of textures for 3D shapes.
We define a trimap partitioning process that generates seamless 3D textures without requiring explicit surface textures.
arXiv Detail & Related papers (2023-02-03T13:18:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.