DTGBrepGen: A Novel B-rep Generative Model through Decoupling Topology and Geometry
- URL: http://arxiv.org/abs/2503.13110v1
- Date: Mon, 17 Mar 2025 12:34:14 GMT
- Title: DTGBrepGen: A Novel B-rep Generative Model through Decoupling Topology and Geometry
- Authors: Jing Li, Yihang Fu, Falai Chen,
- Abstract summary: Boundary representation (B-rep) of geometric models is a fundamental format in Computer-Aided Design (CAD). We propose DTGBrepGen, a novel topology-geometry decoupled framework for B-rep generation.
- Score: 3.859930277034918
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Boundary representation (B-rep) of geometric models is a fundamental format in Computer-Aided Design (CAD). However, automatically generating valid and high-quality B-rep models remains challenging due to the complex interdependence between the topology and geometry of the models. Existing methods tend to prioritize geometric representation while giving insufficient attention to topological constraints, making it difficult to maintain structural validity and geometric accuracy. In this paper, we propose DTGBrepGen, a novel topology-geometry decoupled framework for B-rep generation that explicitly addresses both aspects. Our approach first generates valid topological structures through a two-stage process that independently models edge-face and edge-vertex adjacency relationships. Subsequently, we employ Transformer-based diffusion models for sequential geometry generation, progressively generating vertex coordinates, followed by edge geometries and face geometries which are represented as B-splines. Extensive experiments on diverse CAD datasets show that DTGBrepGen significantly outperforms existing methods in both topological validity and geometric accuracy, achieving higher validity rates and producing more diverse and realistic B-reps. Our code is publicly available at https://github.com/jinli99/DTGBrepGen.
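The abstract describes a two-stage pipeline: topology first (edge-face and edge-vertex adjacency), then geometry (vertex coordinates, followed by edge and face B-splines). The sketch below illustrates that decoupling in miniature; it is a hypothetical stand-in using random sampling, not the authors' diffusion-based implementation, and all function and variable names are assumptions.

```python
import numpy as np

def generate_topology(num_faces, num_edges, rng):
    """Stage 1 (illustrative): sample the adjacency structure before any
    coordinates exist -- an edge-face incidence matrix and, for each edge,
    a pair of vertex indices."""
    edge_face = rng.integers(0, 2, size=(num_edges, num_faces))   # 0/1 incidence
    edge_vertex = rng.integers(0, num_edges, size=(num_edges, 2)) # endpoint ids
    return edge_face, edge_vertex

def generate_geometry(edge_vertex, num_vertices, rng):
    """Stage 2 (illustrative): attach coordinates to the fixed topology,
    standing in for the paper's sequential diffusion models -- vertices
    first, then B-spline control points for each edge."""
    vertices = rng.normal(size=(num_vertices, 3))          # 3D vertex coords
    edge_ctrl = rng.normal(size=(len(edge_vertex), 4, 3))  # 4 control pts/edge
    return vertices, edge_ctrl

rng = np.random.default_rng(0)
ef, ev = generate_topology(num_faces=6, num_edges=12, rng=rng)
verts, edges = generate_geometry(ev, num_vertices=8, rng=rng)
print(ef.shape, verts.shape, edges.shape)  # (12, 6) (8, 3) (12, 4, 3)
```

The key point the sketch preserves is the ordering: geometry is conditioned on a topology that is already fixed, rather than generating both jointly.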
Related papers
- HoLa: B-Rep Generation using a Holistic Latent Representation [51.07878285790399]
We introduce a novel representation for learning and generating Computer-Aided Design (CAD) models in the form of boundary representations (B-Reps).
Our representation unifies the continuous geometric properties of B-Rep primitives in different orders.
Our method significantly reduces ambiguities, redundancies, and incoherences among the generated B-Rep primitives.
arXiv Detail & Related papers (2025-04-19T10:34:24Z) - BRepFormer: Transformer-Based B-rep Geometric Feature Recognition [14.01667117252404]
Recognizing geometric features on B-rep models is a cornerstone technique for multimedia content-based retrieval.
We propose BRepFormer, a novel transformer-based model that recognizes both machining features and the features of complex CAD models.
BRepFormer achieves state-of-the-art accuracy on the MFInstSeg, MFTRCAD, and our CBF datasets.
arXiv Detail & Related papers (2025-04-10T01:36:06Z) - Boundary representation learning via Transformer [0.6906005491572401]
This paper introduces Boundary Representation Transformer (BRT), a novel method adapting Transformer for B-rep learning.
BRT achieves state-of-the-art performance in part classification and feature recognition tasks.
arXiv Detail & Related papers (2025-04-07T07:04:02Z) - R-CoT: Reverse Chain-of-Thought Problem Generation for Geometric Reasoning in Large Multimodal Models [86.06825304372613]
We propose a two-stage Reverse Chain-of-Thought (R-CoT) geometry problem generation pipeline.
First, we introduce GeoChain to produce high-fidelity geometric images and corresponding descriptions.
We then design a Reverse A&Q method that reasons step-by-step based on the descriptions and generates questions in reverse from the reasoning results.
arXiv Detail & Related papers (2024-10-23T13:58:39Z) - A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications [71.809127869349]
This paper formalizes geometric graph as the data structure, on top of which we provide a unified view of existing models from the geometric message passing perspective. We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2024-03-01T12:13:04Z) - Adaptive Surface Normal Constraint for Geometric Estimation from Monocular Images [56.86175251327466]
We introduce a novel approach to learn geometries such as depth and surface normal from images while incorporating geometric context.
Our approach extracts geometric context that encodes the geometric variations present in the input image and correlates depth estimation with geometric constraints.
Our method unifies depth and surface normal estimations within a cohesive framework, which enables the generation of high-quality 3D geometry from images.
arXiv Detail & Related papers (2024-02-08T17:57:59Z) - BrepGen: A B-rep Generative Diffusion Model with Structured Latent Geometry [24.779824909395245]
BrepGen is a diffusion-based generative approach that directly outputs a Boundary representation (Brep) Computer-Aided Design (CAD) model.
BrepGen represents a B-rep model as a novel structured latent geometry in a hierarchical tree.
arXiv Detail & Related papers (2024-01-28T04:07:59Z) - SolidGen: An Autoregressive Model for Direct B-rep Synthesis [15.599363091502365]
The boundary representation (B-rep) format is the de facto shape representation in computer-aided design (CAD).
Recent approaches to generating CAD models have focused on learning sketch-and-extrude modeling sequences.
We present a new approach that enables learning from and synthesizing B-reps without the need for supervision.
arXiv Detail & Related papers (2022-03-26T00:00:45Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - DSG-Net: Learning Disentangled Structure and Geometry for 3D Shape Generation [98.96086261213578]
We introduce DSG-Net, a deep neural network that learns a disentangled structured and geometric mesh representation for 3D shapes.
This supports a range of novel shape generation applications with disentangled control, such as varying the structure while keeping the geometry unchanged, and vice versa.
Our method not only supports controllable generation applications but also produces high-quality synthesized shapes, outperforming state-of-the-art methods.
arXiv Detail & Related papers (2020-08-12T17:06:51Z) - UV-Net: Learning from Boundary Representations [17.47054752280569]
We introduce UV-Net, a novel neural network architecture and representation designed to operate directly on Boundary representation (B-rep) data from 3D CAD models.
B-rep data presents some unique challenges when used with modern machine learning due to the complexity of the data structure and its support for both continuous non-Euclidean geometric entities and discrete topological entities.
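The UV-Net summary notes that B-rep data mixes continuous geometric entities with discrete topology. One common way to make the continuous part consumable by standard networks, loosely in the spirit of UV-Net, is to sample each face on a regular grid in its (u, v) parameter domain, yielding image-like features. The sketch below is an assumption-laden illustration of that idea, not UV-Net's actual code.

```python
import numpy as np

def sample_face_uv_grid(surface_fn, grid=10):
    """Evaluate a parametric surface on a grid x grid lattice of (u, v)
    values in [0, 1]^2, returning a (grid, grid, 3) array of xyz points
    that can be fed to an image-style (2D CNN) encoder."""
    u = np.linspace(0.0, 1.0, grid)
    v = np.linspace(0.0, 1.0, grid)
    uu, vv = np.meshgrid(u, v, indexing="ij")
    pts = np.stack([surface_fn(a, b) for a, b in zip(uu.ravel(), vv.ravel())])
    return pts.reshape(grid, grid, 3)

# Example: a planar patch z = 0 parameterized directly by (u, v).
plane = lambda u, v: np.array([u, v, 0.0])
grid_feat = sample_face_uv_grid(plane)
print(grid_feat.shape)  # (10, 10, 3)
```

The discrete topology (which faces share which edges) would then be handled separately, e.g. as a graph over the per-face features.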
arXiv Detail & Related papers (2020-06-18T00:12:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.