ExtrudeNet: Unsupervised Inverse Sketch-and-Extrude for Shape Parsing
- URL: http://arxiv.org/abs/2209.15632v1
- Date: Fri, 30 Sep 2022 17:58:11 GMT
- Title: ExtrudeNet: Unsupervised Inverse Sketch-and-Extrude for Shape Parsing
- Authors: Daxuan Ren, Jianmin Zheng, Jianfei Cai, Jiatong Li, and Junzhe Zhang
- Abstract summary: This paper studies the problem of learning a shape, given as a point cloud, via inverse sketch-and-extrude.
We present ExtrudeNet, an unsupervised end-to-end network for discovering sketch and extrude from point clouds.
- Score: 46.778258706603005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Sketch-and-extrude is a common and intuitive modeling process in computer
aided design. This paper studies the problem of learning a shape, given as a
point cloud, via inverse sketch-and-extrude. We present ExtrudeNet, an
unsupervised end-to-end network for discovering sketch and extrude from point
clouds. Behind ExtrudeNet are two new technical components: 1) an effective
representation for sketch and extrude, which can model extrusion with freeform
sketches as well as conventional cylinder and box primitives; and 2) a
numerical method for computing the signed distance field which is used in the
network learning. This is the first attempt that uses machine learning to
reverse engineer the sketch-and-extrude modeling process of a shape in an
unsupervised fashion. ExtrudeNet not only outputs a compact, editable and
interpretable representation of the shape that can be seamlessly integrated
into modern CAD software, but also aligns with the standard CAD modeling
process facilitating various editing applications, which distinguishes our work
from existing shape parsing research. Code is released at
https://github.com/kimren227/ExtrudeNet.
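The abstract's second component is a numerical method for computing the signed distance field of an extruded sketch, which drives the network's learning. The paper's exact scheme is not reproduced here; as a rough illustration of how extrusion turns a 2D signed distance into a 3D one, the sketch below uses the standard closed-form extrusion rule from the graphics literature (combining the 2D profile distance with the slab distance along the extrusion axis), with a circle as a hypothetical example profile, not ExtrudeNet's freeform sketch representation:

```python
import numpy as np

def circle_sdf_2d(xy, radius=1.0):
    """2D signed distance to a circle of the given radius (negative inside)."""
    return np.linalg.norm(xy) - radius

def extrude_sdf(p, sdf_2d, half_height):
    """Signed distance to the 3D solid obtained by extruding a 2D profile
    along the z-axis over [-half_height, half_height].

    Standard closed-form extrusion: combine the 2D profile distance with
    the slab distance |z| - h, splitting into inside and outside parts.
    """
    d = sdf_2d(p[:2])                              # distance to the profile in the xy-plane
    w = np.array([d, abs(p[2]) - half_height])     # (profile distance, slab distance)
    inside = min(max(w[0], w[1]), 0.0)             # negative only when inside both
    outside = np.linalg.norm(np.maximum(w, 0.0))   # Euclidean distance when outside either
    return inside + outside

# Extruding a unit circle with half-height 1 yields a cylinder:
print(extrude_sdf(np.array([0.0, 0.0, 0.0]), circle_sdf_2d, 1.0))  # -1.0 (center)
print(extrude_sdf(np.array([2.0, 0.0, 0.0]), circle_sdf_2d, 1.0))  #  1.0 (side)
print(extrude_sdf(np.array([0.0, 0.0, 2.0]), circle_sdf_2d, 1.0))  #  1.0 (cap)
```

Swapping `circle_sdf_2d` for the signed distance of an arbitrary closed sketch curve turns the same rule into a general sketch-and-extrude evaluator, which is why an SDF is a natural fitting target for this modeling process.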
Related papers
- Freehand Sketch Generation from Mechanical Components [16.761960706420066]
MSFormer is the first model to produce human-like freehand sketches tailored for mechanical components.
The first stage employs Open CASCADE technology to obtain multi-view contour sketches from mechanical components.
The second stage translates the contour sketches into freehand sketches with a transformer-based generator.
arXiv Detail & Related papers (2024-08-12T07:44:19Z) - Doodle Your 3D: From Abstract Freehand Sketches to Precise 3D Shapes [118.406721663244]
We introduce a novel part-level modelling and alignment framework that facilitates abstraction modelling and cross-modal correspondence.
Our approach seamlessly extends to sketch modelling by establishing correspondence between CLIPasso edgemaps and projected 3D part regions.
arXiv Detail & Related papers (2023-12-07T05:04:33Z) - SENS: Part-Aware Sketch-based Implicit Neural Shape Modeling [124.3266213819203]
We present SENS, a novel method for generating and editing 3D models from hand-drawn sketches.
SENS analyzes the sketch and encodes its parts into a ViT patch encoding.
SENS supports refinement via part reconstruction, allowing for nuanced adjustments and artifact removal.
arXiv Detail & Related papers (2023-06-09T17:50:53Z) - SECAD-Net: Self-Supervised CAD Reconstruction by Learning Sketch-Extrude
Operations [21.000539206470897]
SECAD-Net is an end-to-end neural network aimed at reconstructing compact and easy-to-edit CAD models.
We show superiority over state-of-the-art alternatives including the closely related method for supervised CAD reconstruction.
arXiv Detail & Related papers (2023-03-19T09:26:03Z) - Reconstructing editable prismatic CAD from rounded voxel models [16.03976415868563]
We introduce a novel neural network architecture to solve this challenging task.
Our method reconstructs the input geometry in the voxel space by decomposing the shape.
During inference, we obtain the CAD data by first searching a database of 2D constrained sketches.
arXiv Detail & Related papers (2022-09-02T16:44:10Z) - Sketch2PQ: Freeform Planar Quadrilateral Mesh Design via a Single Sketch [36.10997511325458]
We present a novel sketch-based system to bridge the concept design and digital modeling of freeform roof-like shapes.
Our system allows the user to sketch the surface boundary and contour lines under axonometric projection.
We propose a deep neural network to infer in real-time the underlying surface shape along with a dense conjugate direction field.
arXiv Detail & Related papers (2022-01-23T21:09:59Z) - Vitruvion: A Generative Model of Parametric CAD Sketches [22.65229769427499]
We present an approach to generative modeling of parametric CAD sketches.
Our model, trained on real-world designs from the SketchGraphs dataset, autoregressively synthesizes sketches as sequences of primitives.
We condition the model on various contexts, including partial sketches (primers) and images of hand-drawn sketches.
arXiv Detail & Related papers (2021-09-29T01:02:30Z) - HybridSDF: Combining Free Form Shapes and Geometric Primitives for
effective Shape Manipulation [58.411259332760935]
Deep-learning based 3D surface modeling has opened new shape design avenues.
These advances have not yet been adopted by the CAD community because they cannot be integrated into engineering workflows.
We propose a novel approach to effectively combining geometric primitives and free-form surfaces represented by implicit surfaces for accurate modeling.
arXiv Detail & Related papers (2021-09-22T14:45:19Z) - Sketch2Mesh: Reconstructing and Editing 3D Shapes from Sketches [65.96417928860039]
We use an encoder/decoder architecture for the sketch to mesh translation.
We show that this approach is easy to deploy, robust to style changes, and effective.
arXiv Detail & Related papers (2021-04-01T14:10:59Z) - Deep Plastic Surgery: Robust and Controllable Image Editing with
Human-Drawn Sketches [133.01690754567252]
Sketch-based image editing aims to synthesize and modify photos based on the structural information provided by the human-drawn sketches.
Deep Plastic Surgery is a novel, robust and controllable image editing framework that allows users to interactively edit images using hand-drawn sketch inputs.
arXiv Detail & Related papers (2020-01-09T08:57:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.