Mesh2Tex: Generating Mesh Textures from Image Queries
- URL: http://arxiv.org/abs/2304.05868v1
- Date: Wed, 12 Apr 2023 13:58:25 GMT
- Title: Mesh2Tex: Generating Mesh Textures from Image Queries
- Authors: Alexey Bokhovkin, Shubham Tulsiani, Angela Dai
- Abstract summary: Reconstructing textured geometry from images of real objects is challenging because the reconstructed geometry is often inexact. We present Mesh2Tex, which learns a realistic object texture manifold from uncorrelated collections of 3D object geometry and photorealistic RGB images.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Remarkable advances have been achieved recently in learning neural
representations that characterize object geometry, while generating textured
objects suitable for downstream applications and 3D rendering remains at an
early stage. In particular, reconstructing textured geometry from images of
real objects is a significant challenge -- reconstructed geometry is often
inexact, making realistic texturing difficult. We present
Mesh2Tex, which learns a realistic object texture manifold from uncorrelated
collections of 3D object geometry and photorealistic RGB images, by leveraging
a hybrid mesh-neural-field texture representation. Our texture representation
enables compact encoding of high-resolution textures as a neural field in the
barycentric coordinate system of the mesh faces. The learned texture manifold
enables effective navigation to generate an object texture for a given 3D
object geometry that matches an input RGB image, and remains robust
even under challenging real-world scenarios where the mesh geometry is only
an inexact match to the underlying geometry in the RGB image.
Mesh2Tex can effectively generate realistic object textures for an object mesh
to match real image observations, towards digitization of real environments,
significantly improving over the previous state of the art.
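The hybrid mesh-neural-field representation described above can be sketched as a small network that maps a per-face latent code plus barycentric coordinates to an RGB color. This is an illustrative assumption of how such a field could be structured, not the authors' implementation; all class names, dimensions, and architecture choices here are hypothetical.

```python
# Minimal sketch of a neural texture field defined in the barycentric
# coordinate system of mesh faces. Sizes and names are illustrative.
import torch
import torch.nn as nn

class FaceTextureField(nn.Module):
    def __init__(self, num_faces: int, face_dim: int = 32, hidden: int = 64):
        super().__init__()
        # One learned latent code per mesh face.
        self.face_codes = nn.Embedding(num_faces, face_dim)
        # MLP maps (face code, barycentric coordinates) -> RGB.
        self.mlp = nn.Sequential(
            nn.Linear(face_dim + 3, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 3),
            nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, face_ids: torch.Tensor, bary: torch.Tensor) -> torch.Tensor:
        # face_ids: (N,) long tensor of face indices
        # bary:     (N, 3) barycentric coordinates, each row summing to 1
        codes = self.face_codes(face_ids)
        return self.mlp(torch.cat([codes, bary], dim=-1))

# Query colors at random surface points of a 100-face mesh.
field = FaceTextureField(num_faces=100)
face_ids = torch.randint(0, 100, (8,))
bary = torch.rand(8, 3)
bary = bary / bary.sum(dim=-1, keepdim=True)  # normalize to valid barycentrics
colors = field(face_ids, bary)                # (8, 3) RGB values
```

Because the field is queried per face rather than through a global UV atlas, texture resolution is decoupled from any fixed image size, which is one way to obtain the compact high-resolution encoding the abstract describes.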
Related papers
- 3DTextureTransformer: Geometry Aware Texture Generation for Arbitrary Mesh Topology
Learning to generate textures for a novel 3D mesh given a collection of 3D meshes and real-world 2D images is an important problem with applications in various domains such as 3D simulation, augmented and virtual reality, gaming, architecture, and design.
Existing solutions either do not produce high-quality textures or deform the original high-resolution input mesh topology into a regular grid to make this generation easier but also lose the original mesh topology.
We present a novel framework called the 3DTextureTransformer that enables us to generate high-quality textures without deforming the original, high-resolution input mesh.
arXiv Detail & Related papers (2024-03-07T05:01:07Z)
- TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion
TextureDreamer is an image-guided texture synthesis method.
It can transfer relightable textures from a small number of input images to target 3D shapes across arbitrary categories.
arXiv Detail & Related papers (2024-01-17T18:55:49Z)
- TMO: Textured Mesh Acquisition of Objects with a Mobile Device by using Differentiable Rendering
We present a new pipeline for acquiring a textured mesh in the wild with a single smartphone.
Our method first introduces an RGBD-aided structure from motion, which can yield filtered depth maps.
We adopt a neural implicit surface reconstruction method, which yields a high-quality mesh.
arXiv Detail & Related papers (2023-03-27T10:07:52Z)
- Delicate Textured Mesh Recovery from NeRF via Adaptive Surface Refinement
We present a novel framework that generates textured surface meshes from images.
Our approach begins by efficiently initializing the geometry and view-dependency appearance with a NeRF.
We jointly refine the appearance with geometry and bake it into texture images for real-time rendering.
arXiv Detail & Related papers (2023-03-03T17:14:44Z)
- Learning Neural Implicit Representations with Surface Signal Parameterizations
We present a neural network architecture that implicitly encodes the underlying surface parameterization suitable for appearance data.
Our model remains compatible with existing mesh-based digital content with appearance data.
arXiv Detail & Related papers (2022-11-01T15:10:58Z)
- Texturify: Generating Textures on 3D Shape Surfaces
We propose Texturify, which learns to generate textures directly on the surface of a 3D input shape.
Our method does not require any 3D color supervision.
arXiv Detail & Related papers (2022-04-05T18:00:04Z)
- Projective Urban Texturing
We propose a method for automatic generation of textures for 3D city meshes in immersive urban environments.
Projective Urban Texturing (PUT) re-targets textural style from real-world panoramic images to unseen urban meshes.
PUT relies on contrastive and adversarial training of a neural architecture designed for unpaired image-to-texture translation.
arXiv Detail & Related papers (2022-01-25T14:56:52Z)
- OSTeC: One-Shot Texture Completion
We propose an unsupervised approach for one-shot 3D facial texture completion.
The proposed approach rotates an input image in 3D and fills in the unseen regions by reconstructing the rotated image with a 2D face generator.
We frontalize the target image by projecting the completed texture into the generator.
arXiv Detail & Related papers (2020-12-30T23:53:26Z)
- Pix2Surf: Learning Parametric 3D Surface Models of Objects from Images
We investigate the problem of learning to generate 3D parametric surface representations for novel object instances, as seen from one or more views.
We design neural networks capable of generating high-quality parametric 3D surfaces which are consistent between views.
Our method is supervised and trained on a public dataset of shapes from common object categories.
arXiv Detail & Related papers (2020-08-18T06:33:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.