NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for
Geometry and Texture Editing
- URL: http://arxiv.org/abs/2207.11911v1
- Date: Mon, 25 Jul 2022 05:30:50 GMT
- Title: NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for
Geometry and Texture Editing
- Authors: Bangbang Yang, Chong Bao, Junyi Zeng, Hujun Bao, Yinda Zhang, Zhaopeng
Cui, Guofeng Zhang
- Abstract summary: We present a novel mesh-based representation by encoding the neural implicit field with disentangled geometry and texture codes on mesh vertices.
We develop several techniques, including learnable sign indicators that magnify the spatial distinguishability of the mesh-based representation.
Experiments and editing examples on both real and synthetic data demonstrate the superiority of our method in representation quality and editing ability.
- Score: 39.71252429542249
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural implicit rendering techniques have evolved rapidly in recent
years and shown great advantages in novel view synthesis and 3D scene
reconstruction. However, existing neural rendering methods for editing offer
only limited functionality, e.g., rigid transformation, or are not applicable
to fine-grained editing of general everyday objects. In this paper, we present
a novel mesh-based representation that encodes the neural implicit field with
disentangled geometry and texture codes on mesh vertices, which facilitates a
set of editing functionalities, including mesh-guided geometry editing and
designated texture editing with texture swapping, filling, and painting
operations. To this end, we develop several techniques, including learnable
sign indicators that magnify the spatial distinguishability of the mesh-based
representation, a distillation and fine-tuning mechanism that ensures steady
convergence, and a spatially-aware optimization strategy for precise texture
editing. Extensive experiments and editing examples on both real and synthetic
data demonstrate the superiority of our method in representation quality and
editing ability. Code is available on the project webpage:
https://zju3dv.github.io/neumesh/.
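To make the representation concrete, below is a minimal PyTorch sketch of the kind of disentangled mesh-based field the abstract describes: each mesh vertex carries separate geometry and texture codes plus a learnable sign indicator, and a query point is decoded from codes interpolated over its nearest vertices. The module names, dimensions, inverse-distance interpolation, and the way the sign indicator enters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a disentangled mesh-based
# implicit field: per-vertex geometry/texture codes, interpolated at query
# points and decoded by two separate MLPs. Sizes are assumptions.
import torch
import torch.nn as nn

class NeuMeshSketch(nn.Module):
    def __init__(self, num_vertices, code_dim=32, k=8):
        super().__init__()
        self.k = k
        self.geo_codes = nn.Parameter(torch.randn(num_vertices, code_dim) * 0.01)
        self.tex_codes = nn.Parameter(torch.randn(num_vertices, code_dim) * 0.01)
        # Learnable per-vertex sign indicators (stand-ins for the paper's
        # mechanism that disambiguates which side of the surface a point is on).
        self.sign_indicators = nn.Parameter(torch.zeros(num_vertices, 3))
        self.sdf_mlp = nn.Sequential(nn.Linear(code_dim + 1, 128), nn.ReLU(),
                                     nn.Linear(128, 1))
        self.color_mlp = nn.Sequential(nn.Linear(code_dim + 1 + 3, 128), nn.ReLU(),
                                       nn.Linear(128, 3), nn.Sigmoid())

    def forward(self, points, view_dirs, vertices):
        # points: (N,3) queries; view_dirs: (N,3); vertices: (V,3) mesh vertices
        dists = torch.cdist(points, vertices)                  # (N, V)
        d, idx = dists.topk(self.k, dim=-1, largest=False)     # k nearest vertices
        w = 1.0 / (d + 1e-8)
        w = w / w.sum(dim=-1, keepdim=True)                    # inverse-distance weights
        geo = (w.unsqueeze(-1) * self.geo_codes[idx]).sum(dim=1)  # interp. geometry code
        tex = (w.unsqueeze(-1) * self.tex_codes[idx]).sum(dim=1)  # interp. texture code
        # Signed distance feature via interpolated sign-indicator directions.
        n = (w.unsqueeze(-1) * self.sign_indicators[idx]).sum(dim=1)
        offset = points - (w.unsqueeze(-1) * vertices[idx]).sum(dim=1)
        h = (offset * n).sum(dim=-1, keepdim=True)
        sdf = self.sdf_mlp(torch.cat([geo, h], dim=-1))
        rgb = self.color_mlp(torch.cat([tex, h, view_dirs], dim=-1))
        return sdf, rgb

# Usage: query 16 points against a 1000-vertex mesh.
model = NeuMeshSketch(num_vertices=1000)
verts = torch.rand(1000, 3)
pts, dirs = torch.rand(16, 3), torch.randn(16, 3)
sdf, rgb = model(pts, dirs / dirs.norm(dim=-1, keepdim=True), verts)
```

Under this factorization, geometry edits reduce to moving mesh vertices while texture edits reduce to swapping, filling, or optimizing per-vertex texture codes, which is the disentanglement the abstract highlights.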
Related papers
- LEMON: Localized Editing with Mesh Optimization and Neural Shaders [0.5499187928849248]
We propose LEMON, a mesh editing pipeline that combines neural deferred shading with localized mesh optimization.
We evaluate our pipeline on the DTU dataset, demonstrating that it generates finely edited meshes more rapidly than current state-of-the-art methods.
arXiv Detail & Related papers (2024-09-18T14:34:06Z)
- Texture-GS: Disentangling the Geometry and Texture for 3D Gaussian Splatting Editing [79.10630153776759]
3D Gaussian splatting, an emerging and groundbreaking approach, has drawn increasing attention for its high-fidelity reconstruction and real-time rendering capabilities.
We propose a novel approach, namely Texture-GS, that disentangles the appearance from the geometry by representing the appearance as a 2D texture mapped onto the 3D surface.
Our method not only facilitates high-fidelity appearance editing but also achieves real-time rendering on consumer-level devices.
arXiv Detail & Related papers (2024-03-15T06:42:55Z)
- Mesh-Guided Neural Implicit Field Editing [42.78979161815414]
We propose a new approach that employs a mesh as a guiding mechanism in editing the neural field.
We first introduce a differentiable method using marching tetrahedra for polygonal mesh extraction from the neural implicit field.
We then design a differentiable color extractor to assign colors obtained from the volume renderings to this extracted mesh.
This differentiable colored mesh allows gradient back-propagation from the explicit mesh to the implicit fields, empowering users to easily manipulate the geometry and color of neural implicit fields.
arXiv Detail & Related papers (2023-12-04T18:59:58Z)
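As a minimal illustration of the gradient path the Mesh-Guided Neural Implicit Field Editing entry describes (not the paper's code), the sketch below defines an edit loss on the colors of extracted mesh vertices and back-propagates it into a stand-in implicit color field; the tiny MLP, vertex count, and painting target are assumptions.

```python
# Sketch: a differentiable "colored mesh" lets a loss on explicit mesh
# vertices update the weights of the implicit field that produced the colors.
import torch
import torch.nn as nn

color_field = nn.Sequential(            # stand-in for an implicit color field
    nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3), nn.Sigmoid())

verts = torch.rand(100, 3)              # vertices of a (fixed) extracted mesh
target = torch.zeros(100, 3)            # user paints these vertices black

opt = torch.optim.Adam(color_field.parameters(), lr=1e-3)
for _ in range(200):
    vert_colors = color_field(verts)    # differentiable color extraction
    loss = ((vert_colors - target) ** 2).mean()  # edit loss on the explicit mesh
    opt.zero_grad()
    loss.backward()                     # gradients flow mesh -> implicit field
    opt.step()
```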
- Text-Guided 3D Face Synthesis -- From Generation to Editing [53.86765812392627]
We propose a unified text-guided framework from face generation to editing.
We employ a fine-tuned texture diffusion model to enhance texture quality in both RGB and YUV space.
We propose a self-guided consistency weight strategy to improve editing efficacy while preserving consistency.
arXiv Detail & Related papers (2023-12-01T06:36:23Z)
- Neural Impostor: Editing Neural Radiance Fields with Explicit Shape Manipulation [49.852533321916844]
We introduce Neural Impostor, a hybrid representation incorporating an explicit tetrahedral mesh alongside a multigrid implicit field.
Our framework bridges the explicit shape manipulation and the geometric editing of implicit fields by utilizing multigrid barycentric coordinate encoding.
We show the robustness and adaptability of our system through diverse examples and experiments, including the editing of both synthetic objects and real captured data.
arXiv Detail & Related papers (2023-10-09T04:07:00Z)
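A minimal sketch of the barycentric coordinate encoding that the Neural Impostor entry mentions, under simplifying assumptions of my own: a point inside a tetrahedron is encoded as four barycentric weights, so deforming the explicit tet moves the encoded point while the implicit field is still addressed through the same weights.

```python
# Sketch: barycentric encoding in one tetrahedron (illustrative, not the
# Neural Impostor code). Weights are found by solving a 4x4 linear system.
import torch

def tet_barycentric(p, tet):
    # p: (3,) query point; tet: (4,3) tetrahedron vertices.
    # Solve p = sum_i w_i * v_i subject to sum_i w_i = 1.
    A = torch.cat([tet.T, torch.ones(1, 4)], dim=0)  # (4,4) system matrix
    b = torch.cat([p, torch.ones(1)])                # (4,) right-hand side
    return torch.linalg.solve(A, b)                  # barycentric weights

tet = torch.tensor([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
p = torch.tensor([0.25, 0.25, 0.25])
w = tet_barycentric(p, tet)      # weights stay fixed under deformation
tet_deformed = tet * 2.0         # user scales the tet; the encoded point
p_deformed = w @ tet_deformed    # follows while the field sees the same w
```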
- DreamEditor: Text-Driven 3D Scene Editing with Neural Fields [115.07896366760876]
We propose a novel framework that enables users to edit neural fields using text prompts.
DreamEditor generates highly realistic textures and geometry, significantly surpassing previous works in both quantitative and qualitative evaluations.
arXiv Detail & Related papers (2023-06-23T11:53:43Z)
- SINE: Semantic-driven Image-based NeRF Editing with Prior-guided Editing Field [37.8162035179377]
We present a novel semantic-driven NeRF editing approach, which enables users to edit a neural radiance field with a single image.
To achieve this goal, we propose a prior-guided editing field to encode fine-grained geometric and texture editing in 3D space.
Our method achieves photo-realistic 3D editing using only a single edited image, pushing the boundary of semantic-driven editing in 3D real-world scenes.
arXiv Detail & Related papers (2023-03-23T13:58:11Z)
- Delicate Textured Mesh Recovery from NeRF via Adaptive Surface Refinement [78.48648360358193]
We present a novel framework that generates textured surface meshes from images.
Our approach begins by efficiently initializing the geometry and view-dependent appearance with a NeRF.
We jointly refine the appearance with geometry and bake it into texture images for real-time rendering.
arXiv Detail & Related papers (2023-03-03T17:14:44Z)
- 3D Neural Sculpting (3DNS): Editing Neural Signed Distance Functions [34.39282814876276]
In this work, we propose the first method for efficient interactive editing of signed distance functions expressed through neural networks.
Inspired by 3D sculpting software for meshes, we use a brush-based framework that is intuitive and can in the future be used by sculptors and digital artists.
arXiv Detail & Related papers (2022-09-28T10:05:16Z)
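A minimal sketch of brush-based neural SDF editing in the spirit of the 3DNS entry above (illustrative, not the paper's method): fine-tune an SDF network so the surface shifts by a smooth bump inside the brush footprint while a frozen copy anchors the field everywhere else; the brush profile and loss weighting are assumptions.

```python
# Sketch: brush-based SDF editing. Inside the brush, pull the field toward an
# offset surface; outside, regularize against a frozen snapshot.
import torch
import torch.nn as nn

sdf = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))
frozen = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))
frozen.load_state_dict(sdf.state_dict())      # snapshot of the unedited field
for p in frozen.parameters():
    p.requires_grad_(False)

center = torch.tensor([0., 0., 0.])
radius, height = 0.2, 0.05                    # brush footprint and strength
opt = torch.optim.Adam(sdf.parameters(), lr=1e-4)
for _ in range(500):
    pts = torch.rand(1024, 3) * 2 - 1                  # samples in [-1,1]^3
    d = (pts - center).norm(dim=-1, keepdim=True)
    bump = height * torch.relu(1 - (d / radius) ** 2)  # smooth brush falloff
    inside = (d < radius).float()
    # Inside the brush: shift the surface outward by the bump amount.
    edit_loss = (inside * (sdf(pts) - (frozen(pts) - bump)) ** 2).mean()
    # Outside the brush: keep the original field intact.
    keep_loss = ((1 - inside) * (sdf(pts) - frozen(pts)) ** 2).mean()
    loss = edit_loss + keep_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```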
This list is automatically generated from the titles and abstracts of the papers in this site.