GetMesh: A Controllable Model for High-quality Mesh Generation and Manipulation
- URL: http://arxiv.org/abs/2403.11990v1
- Date: Mon, 18 Mar 2024 17:25:36 GMT
- Title: GetMesh: A Controllable Model for High-quality Mesh Generation and Manipulation
- Authors: Zhaoyang Lyu, Ben Fei, Jinyi Wang, Xudong Xu, Ya Zhang, Weidong Yang, Bo Dai
- Abstract summary: Mesh is a fundamental representation of 3D assets in various industrial applications and is widely supported by professional software.
We propose a highly controllable generative model, GetMesh, for mesh generation and manipulation across different categories.
- Score: 25.42531640985281
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Mesh is a fundamental representation of 3D assets in various industrial applications and is widely supported by professional software. However, due to its irregular structure, mesh creation and manipulation are often time-consuming and labor-intensive. In this paper, we propose a highly controllable generative model, GetMesh, for mesh generation and manipulation across different categories. By taking a varying number of points as the latent representation and re-organizing them as a triplane representation, GetMesh generates meshes with rich and sharp details, outperforming both single-category and multi-category counterparts. Moreover, it enables fine-grained control over the generation process that previous mesh generative models cannot achieve: changing global/local mesh topologies, adding/removing mesh parts, and combining mesh parts across categories can be accomplished intuitively, efficiently, and robustly by adjusting the number, positions, or features of latent points. Project page: https://getmesh.github.io.
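To make the latent-points-to-triplane idea concrete, here is a minimal sketch of one way per-point features can be re-organized onto three axis-aligned feature planes and queried at arbitrary 3D locations. This is an illustrative assumption, not GetMesh's actual architecture; all names, resolutions, and shapes below are invented.

```python
import numpy as np

def splat_points_to_triplane(points, feats, res=32):
    """Scatter per-point features onto three axis-aligned feature planes
    (XY, XZ, YZ). points: (N, 3) in [-1, 1]; feats: (N, C)."""
    C = feats.shape[1]
    planes = np.zeros((3, res, res, C))           # one plane per axis pair
    counts = np.zeros((3, res, res, 1))
    idx = np.clip(((points + 1) / 2 * res).astype(int), 0, res - 1)
    axes = [(0, 1), (0, 2), (1, 2)]               # XY, XZ, YZ projections
    for p, (a, b) in enumerate(axes):
        np.add.at(planes[p], (idx[:, a], idx[:, b]), feats)
        np.add.at(counts[p], (idx[:, a], idx[:, b]), 1.0)
    return planes / np.maximum(counts, 1.0)       # average overlapping points

def query_triplane(planes, xyz):
    """Look up a 3D point's feature as the sum of its three plane features."""
    res = planes.shape[1]
    idx = np.clip(((xyz + 1) / 2 * res).astype(int), 0, res - 1)
    axes = [(0, 1), (0, 2), (1, 2)]
    return sum(planes[p][idx[:, a], idx[:, b]] for p, (a, b) in enumerate(axes))

# Toy usage: 256 latent points with 8-dim features, queried at 4 locations.
pts = np.random.uniform(-1, 1, (256, 3))
planes = splat_points_to_triplane(pts, np.random.randn(256, 8))
print(query_triplane(planes, np.random.uniform(-1, 1, (4, 3))).shape)  # (4, 8)
```

Averaging overlapping points keeps the plane features well-scaled however many latent points land in a cell, which matters when the number of latent points varies.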
Related papers
- GenUDC: High Quality 3D Mesh Generation with Unsigned Dual Contouring Representation [13.923644541595893]
Generating high-quality meshes with complex structures and realistic surfaces remains challenging for 3D generative models.
We propose the GenUDC framework to address these challenges by leveraging the Unsigned Dual Contouring (UDC) as the mesh representation.
In addition, GenUDC adopts a two-stage, coarse-to-fine generative process for 3D mesh generation.
arXiv Detail & Related papers (2024-10-23T11:59:49Z)
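As a rough illustration of the unsigned-dual-contouring idea, the sketch below places one predicted vertex in each grid cell and emits a quad for every grid edge flagged as crossed, joining the four cells that share the edge. The inputs and shapes are assumptions for the toy; GenUDC's real pipeline predicts these fields with a two-stage generative model.

```python
import numpy as np

def udc_extract(vert_offsets, edge_flags, res):
    """Toy unsigned-dual-contouring-style extraction on a res^3 cell grid.
    vert_offsets: (res, res, res, 3) predicted vertex offset inside each cell.
    edge_flags[axis]: boolean (res+1, res+1, res+1); True at lattice point e
    means the surface crosses the unit edge from e along `axis`, emitting a
    quad that joins the four cells sharing that edge."""
    ids, coords, quads = {}, [], []

    def vid(cell):
        if cell not in ids:
            ids[cell] = len(coords)
            coords.append(np.array(cell) + 0.5 + vert_offsets[cell])
        return ids[cell]

    for axis in range(3):
        a, b = (axis + 1) % 3, (axis + 2) % 3
        for e in np.argwhere(edge_flags[axis]):
            quad = []
            for da, db in [(0, 0), (1, 0), (1, 1), (0, 1)]:
                c = list(e)
                c[a] -= da
                c[b] -= db
                if min(c) < 0 or max(c) >= res:
                    break                      # edge on the grid boundary
                quad.append(vid(tuple(int(x) for x in c)))
            else:
                quads.append(quad)
    return np.array(coords), quads

flags = [np.zeros((5, 5, 5), bool) for _ in range(3)]
flags[2][2, 2, 2] = True                       # one crossed edge along z
verts, quads = udc_extract(np.zeros((4, 4, 4, 3)), flags, res=4)
print(verts.shape, quads)                      # (4, 3) [[0, 1, 2, 3]]
```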
- SpaceMesh: A Continuous Representation for Learning Manifold Surface Meshes
We present a scheme to directly generate manifold, polygonal meshes of complex connectivity as the output of a neural network.
Our key innovation is to define a continuous latent connectivity space at each mesh vertex, which implies the discrete mesh.
In applications, this approach not only yields high-quality outputs from generative models, but also enables directly learning challenging geometry processing tasks such as mesh repair.
arXiv Detail & Related papers (2024-09-30T17:59:03Z)
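The following is only a loose toy analogy of "continuous connectivity implying a discrete mesh", not SpaceMesh's actual halfedge-based construction: per-vertex embeddings are decoded into edges by connecting mutual nearest neighbors in embedding space.

```python
import numpy as np

def edges_from_embeddings(emb, k=3):
    """Toy: read off discrete edges from continuous per-vertex embeddings by
    connecting mutual k-nearest neighbors, so nearby codes imply connectivity."""
    d = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # ignore self-distances
    nn = np.argsort(d, axis=1)[:, :k]
    cand = {(i, int(j)) for i in range(len(emb)) for j in nn[i]}
    return sorted((i, j) for i, j in cand if i < j and (j, i) in cand)

emb = np.random.randn(8, 4)        # 8 vertices, 4-dim connectivity embedding
print(edges_from_embeddings(emb))  # e.g. [(0, 3), (1, 5), ...]
```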
- MeshAnything V2: Artist-Created Mesh Generation With Adjacent Mesh Tokenization
MeshAnything V2 is an advanced mesh generation model designed to create Artist-Created Meshes.
A key innovation behind MeshAnything V2 is our novel Adjacent Mesh Tokenization (AMT) method.
arXiv Detail & Related papers (2024-08-05T15:33:45Z)
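One way to see why adjacency-aware tokenization compresses sequences: a face that shares an edge with the previous face introduces only one new vertex. The greedy sketch below illustrates that counting argument; it is not MeshAnything V2's actual codec, which also fixes traversal and decoding rules omitted here.

```python
def amt_tokenize(faces):
    """Sketch of adjacent-style mesh tokenization: a face sharing an edge
    with the previously emitted face contributes only its one new vertex;
    otherwise all three vertices are emitted after a restart marker ('&')."""
    tokens, prev = [], None
    for f in faces:
        shared = set(f) & set(prev) if prev else set()
        if len(shared) == 2:                     # adjacent: 1 new vertex
            tokens += [v for v in f if v not in shared]
        else:                                    # restart: full face
            tokens += ["&", *f]
        prev = f
    return tokens

# Two faces sharing edge (1, 2), then a disconnected face.
print(amt_tokenize([(0, 1, 2), (1, 2, 3), (7, 8, 9)]))
# ['&', 0, 1, 2, 3, '&', 7, 8, 9]
```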
- MeshAnything: Artist-Created Mesh Generation with Autoregressive Transformers
We introduce MeshAnything, a model that treats mesh extraction as a generation problem.
By converting 3D assets in any 3D representation into Artist-Created Meshes (AMs), MeshAnything can be integrated with various 3D asset production methods.
Our method generates AMs with hundreds of times fewer faces, significantly improving storage, rendering, and simulation efficiency.
arXiv Detail & Related papers (2024-06-14T16:30:25Z)
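A sketch of what "mesh extraction as a generation problem" can look like as an interface: any representation able to yield surface samples becomes the condition for a learned mesh generator. Both callables below are hypothetical stand-ins, not MeshAnything's API.

```python
import numpy as np

def extract_as_generation(sample_points, generate_mesh, n=4096):
    """Any 3D asset that can yield surface samples (a NeRF, an SDF, a scan)
    becomes the condition for a learned mesh generator."""
    cond = sample_points(n)              # (n, 3) surface point cloud
    return generate_mesh(cond)           # compact, artist-style mesh

def sample_sphere(n):                    # toy "3D asset": a unit sphere
    p = np.random.randn(n, 3)
    return p / np.linalg.norm(p, axis=1, keepdims=True)

def dummy_generator(points):             # placeholder for the learned model
    return {"verts": points[:3], "faces": [(0, 1, 2)]}

print(extract_as_generation(sample_sphere, dummy_generator)["faces"])
```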
- MeshXL: Neural Coordinate Field for Generative 3D Foundation Models
We present a family of generative pre-trained auto-regressive models that address 3D mesh generation with modern large-language-model approaches.
MeshXL generates high-quality 3D meshes and can also serve as a foundation model for various downstream applications.
arXiv Detail & Related papers (2024-05-31T14:35:35Z)
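Autoregressive mesh models in this family need meshes serialized as discrete tokens. A common scheme, assumed here rather than taken from MeshXL, quantizes each vertex coordinate into a fixed number of bins and flattens faces into one sequence:

```python
import numpy as np

def mesh_to_tokens(verts, faces, bins=128):
    """Quantize xyz coordinates into `bins` discrete levels and flatten the
    face list into one token sequence an autoregressive model can ingest."""
    lo, hi = verts.min(0), verts.max(0)
    q = np.round((verts - lo) / np.maximum(hi - lo, 1e-8) * (bins - 1)).astype(int)
    # One token per coordinate: 9 tokens per triangle (3 vertices x 3 coords).
    return [int(q[v, c]) for f in faces for v in f for c in range(3)]

verts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(mesh_to_tokens(verts, [(0, 1, 2), (0, 2, 3)]))
```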
- PivotMesh: Generic 3D Mesh Generation via Pivot Vertices Guidance
We introduce PivotMesh, a generic and scalable mesh generation framework.
PivotMesh makes an initial attempt to extend native mesh generation to large-scale datasets.
We show that PivotMesh can generate compact and sharp 3D meshes across various categories.
arXiv Detail & Related papers (2024-05-27T07:13:13Z)
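A minimal sketch of the two-stage idea: first pick a handful of "pivot" vertices that outline the shape, then (not shown) generate the full mesh conditioned on them. The degree-based selection below is an invented stand-in, not necessarily the paper's rule for choosing pivots.

```python
from collections import Counter

def pivot_vertices(faces, k=4):
    """Coarse stage sketch: take the k highest-degree vertices as 'pivots'
    outlining the topology; a second model would generate the full mesh
    conditioned on them."""
    deg = Counter(v for f in faces for v in f)
    return [v for v, _ in deg.most_common(k)]

faces = [(0, 1, 2), (0, 2, 3), (0, 3, 4), (5, 6, 7)]
print(pivot_vertices(faces, k=2))   # vertex 0 dominates -> likely a pivot
```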
- Mesh Draping: Parametrization-Free Neural Mesh Transfer
Mesh Draping is a neural method for transferring existing mesh structure from one shape to another.
We show that guiding the neural optimization with gradually increasing frequencies achieves stable, high-quality mesh transfer.
arXiv Detail & Related papers (2021-10-11T17:24:52Z)
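Gradually increasing frequencies of this kind are commonly implemented by fading in Fourier positional-encoding bands over the course of optimization, so the fit proceeds coarse-to-fine. A minimal version with an assumed linear schedule:

```python
import numpy as np

def progressive_encoding(x, step, n_freqs=8, ramp=0.25):
    """Fourier positional encoding whose higher frequencies are faded in
    over optimization steps, so fitting proceeds coarse-to-fine. The
    linear per-band ramp is an assumed schedule."""
    feats = []
    for i in range(n_freqs):
        w = np.clip(step * ramp - i, 0.0, 1.0)   # band i opens at step i/ramp
        feats += [w * np.sin(2**i * np.pi * x), w * np.cos(2**i * np.pi * x)]
    return np.concatenate(feats, axis=-1)

x = np.random.rand(5, 3)
print(progressive_encoding(x, step=2).shape)   # (5, 48); only low bands active
```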
- Fully Convolutional Mesh Autoencoder using Efficient Spatially Varying Kernels [41.81187438494441]
We propose a non-template-specific fully convolutional mesh autoencoder for arbitrary registered mesh data.
Our model outperforms state-of-the-art methods on reconstruction accuracy.
arXiv Detail & Related papers (2020-06-08T02:30:13Z)
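A sketch of what "efficient spatially varying kernels" can mean: instead of storing a full kernel per vertex, each vertex mixes a small shared weight basis with its own coefficients. Names and shapes are assumptions, and the aggregation here is a simple patch average rather than the paper's scheme.

```python
import numpy as np

def spatially_varying_conv(x, neighbors, basis, coeff):
    """Graph convolution with spatially varying kernels: each vertex builds
    its own kernel as a weighted sum of a shared basis.
    x: (V, C_in); neighbors: list of neighbor index lists;
    basis: (B, C_in, C_out); coeff: (V, B)."""
    out = np.zeros((x.shape[0], basis.shape[2]))
    for v, nbrs in enumerate(neighbors):
        W = np.tensordot(coeff[v], basis, axes=1)     # per-vertex kernel
        agg = x[[v, *nbrs]].mean(0)                   # average local patch
        out[v] = agg @ W
    return out

V, B, Cin, Cout = 5, 4, 6, 8
nbrs = [[1, 2], [0, 2], [0, 1, 3], [2, 4], [3]]
y = spatially_varying_conv(np.random.randn(V, Cin), nbrs,
                           np.random.randn(B, Cin, Cout), np.random.randn(V, B))
print(y.shape)   # (5, 8)
```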