TreeSketchNet: From Sketch To 3D Tree Parameters Generation
- URL: http://arxiv.org/abs/2207.12297v1
- Date: Mon, 25 Jul 2022 16:08:05 GMT
- Title: TreeSketchNet: From Sketch To 3D Tree Parameters Generation
- Authors: Gilda Manfredi, Nicola Capece, Ugo Erra, and Monica Gruosso
- Abstract summary: 3D modeling of non-linear objects from stylized sketches is a challenge even for experts in computer graphics.
We propose a broker system that mediates between the modeler and the 3D modelling software.
- Score: 4.234843176066354
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 3D modeling of non-linear objects from stylized sketches is a challenge even
for experts in computer graphics. The extrapolation of object parameters from
a stylized sketch is a very complex and cumbersome task. In the present study,
we propose a broker system that mediates between the modeler and the 3D
modelling software and can transform a stylized sketch of a tree into a
complete 3D model. The input sketches do not need to be accurate or detailed,
and only need to represent a rudimentary outline of the tree that the modeler
wishes to 3D-model. Our approach is based on a well-defined Deep Neural Network
(DNN) architecture, which we call TreeSketchNet (TSN), based on convolutions and
able to generate Weber and Penn parameters that can be interpreted by the
modelling software to generate a 3D model of a tree starting from a simple
sketch. The training dataset consists of synthetically-generated sketches that
are associated with Weber-Penn parameters generated by a dedicated Blender
modelling software add-on. The accuracy of the proposed method is demonstrated
by testing the TSN with both synthetic and hand-made sketches. Finally, we
provide a qualitative analysis of our results, by evaluating the coherence of
the predicted parameters with several distinguishing features.
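As a concrete illustration of the pipeline the abstract describes, the following is a minimal sketch of a convolutional regressor mapping a sketch image to a flattened Weber-Penn parameter vector. The layer sizes, the parameter-vector length, and the loss are assumptions for illustration only, not the authors' actual TSN architecture.

```python
# A minimal sketch (not the authors' code): a convolutional encoder that
# regresses a flattened Weber-Penn parameter vector from a sketch image.
# num_params, layer sizes, and the MSE target are illustrative assumptions.
import torch
import torch.nn as nn

class SketchToTreeParams(nn.Module):
    def __init__(self, num_params: int = 64):  # hypothetical vector length
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),   # 1-channel sketch
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(128, num_params)

    def forward(self, sketch: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(sketch))

model = SketchToTreeParams()
sketch = torch.rand(1, 1, 256, 256)   # stand-in for a synthetic training sketch
params = model(sketch)                # predicted Weber-Penn parameter vector
loss = nn.functional.mse_loss(params, torch.zeros_like(params))  # illustrative regression loss
```

In the paper's pipeline, the predicted parameter vector is then interpreted by the Blender modelling software to generate the final 3D tree.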
Related papers
- Sketch3D: Style-Consistent Guidance for Sketch-to-3D Generation [55.73399465968594]
This paper proposes Sketch3D, a novel generation paradigm that produces realistic 3D assets whose shape aligns with the input sketch and whose color matches the textual description.
Three strategies are designed to optimize 3D Gaussians: structural optimization via a distribution transfer mechanism, color optimization with a straightforward MSE loss, and sketch similarity optimization with a CLIP-based geometric similarity loss.
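As a rough illustration of how these three terms might combine into a single objective (a hedged sketch with placeholder weights, embeddings, and a stand-in structural term, not the paper's implementation):

```python
# A hedged sketch of a three-term objective in the spirit of Sketch3D.
# The weights, embeddings, and the structural term passed in are
# placeholders, not the paper's components.
import torch
import torch.nn.functional as F

def sketch3d_style_loss(struct_term, render, target_color, render_emb, sketch_emb,
                        w=(1.0, 1.0, 0.1)):
    color = F.mse_loss(render, target_color)            # straightforward MSE color term
    sim = F.cosine_similarity(render_emb, sketch_emb, dim=-1).mean()
    return w[0] * struct_term + w[1] * color + w[2] * (1.0 - sim)  # CLIP-style similarity term

# toy call with random tensors; struct_term would come from the
# distribution transfer mechanism in the actual method
demo = sketch3d_style_loss(torch.tensor(0.0),
                           torch.rand(3, 64, 64), torch.rand(3, 64, 64),
                           torch.rand(1, 512), torch.rand(1, 512))
```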
arXiv Detail & Related papers (2024-04-02T11:03:24Z)
- Doodle Your 3D: From Abstract Freehand Sketches to Precise 3D Shapes [118.406721663244]
We introduce a novel part-level modelling and alignment framework that facilitates abstraction modelling and cross-modal correspondence.
Our approach seamlessly extends to sketch modelling by establishing correspondence between CLIPasso edgemaps and projected 3D part regions.
arXiv Detail & Related papers (2023-12-07T05:04:33Z)
- SENS: Part-Aware Sketch-based Implicit Neural Shape Modeling [124.3266213819203]
We present SENS, a novel method for generating and editing 3D models from hand-drawn sketches.
SENS analyzes the sketch and encodes its parts into ViT patch encodings.
SENS supports refinement via part reconstruction, allowing for nuanced adjustments and artifact removal.
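A minimal sketch of generic ViT-style patch encoding for a sketch image, assuming a single-channel input and a hypothetical 192-dimensional token size; SENS's actual part-aware encoder is more involved:

```python
# A minimal ViT-style patch encoding (not SENS itself): split the sketch
# into 16x16 patches and project each to a token via one strided conv.
import torch
import torch.nn as nn

patchify = nn.Conv2d(1, 192, kernel_size=16, stride=16)  # hypothetical 192-d tokens
sketch = torch.rand(1, 1, 224, 224)                      # single-channel sketch image
tokens = patchify(sketch).flatten(2).transpose(1, 2)     # (1, 196, 192): 14x14 patch tokens
print(tokens.shape)
```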
arXiv Detail & Related papers (2023-06-09T17:50:53Z)
- Sketch2Cloth: Sketch-based 3D Garment Generation with Unsigned Distance Fields [12.013968508918634]
We propose Sketch2Cloth, a sketch-based 3D garment generation system using the unsigned distance fields from the user's sketch input.
Sketch2Cloth first estimates the unsigned distance function of the target 3D model from the sketch input, and extracts the mesh from the estimated field with Marching Cubes.
We also provide the model editing function to modify the generated mesh.
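The meshing step can be illustrated with a toy unsigned distance field. Since a UDF is non-negative, Marching Cubes is run at a small positive level, which yields a thin two-sided shell; the synthetic sphere below stands in for the network's predicted garment field (a sketch, not Sketch2Cloth's code):

```python
# A minimal sketch of the meshing step only: Marching Cubes on an
# unsigned distance field at a small positive level. The synthetic
# sphere UDF stands in for the network's predicted garment field.
import numpy as np
from skimage import measure

grid = np.linspace(-1.0, 1.0, 64)
x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
udf = np.abs(np.sqrt(x**2 + y**2 + z**2) - 0.5)   # distance to a sphere of radius 0.5

# UDFs never cross zero, so extract a thin shell at a small epsilon
verts, faces, normals, values = measure.marching_cubes(udf, level=0.02)
print(verts.shape, faces.shape)  # mesh vertices and triangle indices
```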
arXiv Detail & Related papers (2023-03-01T01:45:28Z)
- Geometric Understanding of Sketches [0.0]
I explore two methods that help a system provide a geometric machine-understanding of sketches, and in turn help a user accomplish a downstream task.
The first work deals with interpretation of a 2D-line drawing as a graph structure, and also illustrates its effectiveness through its physical reconstruction by a robot.
In the second work, we test the 3D-geometric understanding of a sketch-based system without explicit access to the information about 3D-geometry.
arXiv Detail & Related papers (2022-04-13T23:55:51Z)
- SingleSketch2Mesh : Generating 3D Mesh model from Sketch [1.6973426830397942]
Current methods to generate 3D models from sketches are either manual or tightly coupled with 3D modeling platforms.
We propose a novel AI based ensemble approach, SingleSketch2Mesh, for generating 3D models from hand-drawn sketches.
arXiv Detail & Related papers (2022-03-07T06:30:36Z)
- Sketch2PQ: Freeform Planar Quadrilateral Mesh Design via a Single Sketch [36.10997511325458]
We present a novel sketch-based system to bridge the concept design and digital modeling of freeform roof-like shapes.
Our system allows the user to sketch the surface boundary and contour lines under axonometric projection.
We propose a deep neural network to infer in real-time the underlying surface shape along with a dense conjugate direction field.
arXiv Detail & Related papers (2022-01-23T21:09:59Z)
- Interactive Annotation of 3D Object Geometry using 2D Scribbles [84.51514043814066]
In this paper, we propose an interactive framework for annotating 3D object geometry from point cloud data and RGB imagery.
Our framework targets naive users without artistic or graphics expertise.
arXiv Detail & Related papers (2020-08-24T21:51:29Z)
- Combining Implicit Function Learning and Parametric Models for 3D Human Reconstruction [123.62341095156611]
Implicit functions represented as deep learning approximations are powerful for reconstructing 3D surfaces.
Such features are essential in building flexible models for both computer graphics and computer vision.
We present methodology that combines detail-rich implicit functions and parametric representations.
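A minimal sketch of the implicit-function ingredient alone, with a hypothetical conditioning vector standing in for features from a parametric body model; the paper's actual combination for human reconstruction is considerably richer:

```python
# A minimal implicit-surface MLP (not the paper's model): maps a 3D query
# point plus a hypothetical parametric-model feature to a signed distance.
import torch
import torch.nn as nn

class ImplicitSurface(nn.Module):
    def __init__(self, cond_dim: int = 16):  # assumed conditioning size
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 + cond_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, 1),               # predicted signed distance
        )

    def forward(self, pts, cond):
        # pts: (n, 3) query points; cond: (n, cond_dim) parametric features
        return self.mlp(torch.cat([pts, cond], dim=-1))

net = ImplicitSurface()
sdf = net(torch.rand(1024, 3), torch.zeros(1024, 16))  # (1024, 1)
```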
arXiv Detail & Related papers (2020-07-22T13:46:14Z)
- BézierSketch: A generative model for scalable vector sketches [132.5223191478268]
We present BézierSketch, a novel generative model for fully vector sketches that are automatically scalable and high-resolution.
We first introduce a novel inverse graphics approach to stroke embedding that trains an encoder to embed each stroke to its best fit Bézier curve.
This enables us to treat sketches as short sequences of parameterized strokes and thus train a recurrent sketch generator with greater capacity for longer sketches.
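The stroke-embedding idea can be illustrated with a plain least-squares fit of a single cubic Bézier curve under chord-length parameterization; this is a simplified stand-in for the paper's learned inverse-graphics encoder:

```python
# A simplified stand-in for stroke embedding: least-squares fit of one
# cubic Bezier curve (4 control points) to a polyline stroke.
import numpy as np

def fit_cubic_bezier(stroke: np.ndarray) -> np.ndarray:
    """stroke: (n, 2) points; returns (4, 2) fitted control points."""
    d = np.linalg.norm(np.diff(stroke, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(d)]) / d.sum()   # chord-length parameters
    A = np.stack([(1 - t) ** 3,                            # cubic Bernstein basis
                  3 * t * (1 - t) ** 2,
                  3 * t ** 2 * (1 - t),
                  t ** 3], axis=1)
    ctrl, *_ = np.linalg.lstsq(A, stroke, rcond=None)
    return ctrl

s = np.linspace(0, np.pi / 2, 50)                          # toy stroke along an arc
stroke = np.stack([np.cos(s), np.sin(s)], axis=1) + 0.01 * np.random.randn(50, 2)
print(fit_cubic_bezier(stroke))                            # four control points
```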
arXiv Detail & Related papers (2020-07-04T21:30:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.