Feature-Preserving Mesh Decimation for Normal Integration
- URL: http://arxiv.org/abs/2504.00867v1
- Date: Tue, 01 Apr 2025 14:56:47 GMT
- Title: Feature-Preserving Mesh Decimation for Normal Integration
- Authors: Moritz Heep, Sven Behnke, Eduard Zell
- Abstract summary: Normal integration reconstructs 3D surfaces from normal maps obtained by photometric stereo. In this work, we replace the dense pixel grid with a sparse anisotropic triangle mesh prior to normal integration.
- Score: 16.320467417627277
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normal integration reconstructs 3D surfaces from normal maps obtained e.g. by photometric stereo. These normal maps capture surface details down to the pixel level but require large computational resources for integration at high resolutions. In this work, we replace the dense pixel grid with a sparse anisotropic triangle mesh prior to normal integration. We adapt the triangle mesh to the local geometry in the case of complex surface structures and remove oversampling from flat featureless regions. For high-resolution images, the resulting compression reduces normal integration runtimes from hours to minutes while maintaining high surface accuracy. Our main contribution is the derivation of the well-known quadric error measure from mesh decimation for screen space applications and its combination with optimal Delaunay triangulation.
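The paper's main contribution builds on the classical quadric error measure (QEM) from mesh decimation. As a hedged illustration of that underlying measure (not the authors' screen-space derivation), the error of placing a vertex is the sum of its squared distances to the supporting planes of its incident triangles, encoded as a single 4x4 matrix per vertex; all names below are illustrative:

```python
import numpy as np

def plane_quadric(p, n):
    """Quadric K = v v^T for the plane through point p with unit normal n,
    where v = (a, b, c, d) are the homogeneous plane coefficients."""
    d = -np.dot(n, p)
    v = np.append(n, d)
    return np.outer(v, v)        # 4x4 symmetric matrix

def vertex_error(Q, x):
    """Squared-distance cost v^T Q v of placing a vertex at position x."""
    v = np.append(x, 1.0)
    return float(v @ Q @ v)

# A vertex accumulates the quadrics of its incident planes; collapsing an
# edge sums the endpoint quadrics and evaluates candidate positions.
p = np.zeros(3)
n1 = np.array([0.0, 0.0, 1.0])   # plane z = 0
n2 = np.array([0.0, 1.0, 0.0])   # plane y = 0
Q = plane_quadric(p, n1) + plane_quadric(p, n2)
print(vertex_error(Q, np.array([0.0, 0.0, 0.0])))   # lies on both planes -> 0.0
print(vertex_error(Q, np.array([0.0, 0.0, 1.0])))   # unit off the z=0 plane -> 1.0
```

Collapsing the edge whose summed quadric gives the smallest minimal error is what removes oversampling in flat regions while keeping detailed ones dense.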
Related papers
- SparseFlex: High-Resolution and Arbitrary-Topology 3D Shape Modeling [79.56581753856452]
SparseFlex is a novel sparse-structured isosurface representation that enables differentiable mesh reconstruction at resolutions up to $1024^3$ directly from rendering losses. By enabling high-resolution, differentiable mesh reconstruction and generation with rendering losses, SparseFlex significantly advances the state-of-the-art in 3D shape representation and modeling.
arXiv Detail & Related papers (2025-03-27T17:46:42Z)
- GeoSplatting: Towards Geometry Guided Gaussian Splatting for Physically-based Inverse Rendering [69.67264955234494]
GeoSplatting is a novel hybrid representation that augments 3DGS with explicit geometric guidance and differentiable PBR equations.
Comprehensive evaluations across diverse datasets demonstrate the superiority of GeoSplatting.
arXiv Detail & Related papers (2024-10-31T17:57:07Z)
- An Adaptive Screen-Space Meshing Approach for Normal Integration [0.0]
This work introduces an adaptive surface triangulation in the image domain and afterwards performs the normal integration on a triangle mesh.
Based on the curvature, we identify flat areas and aggregate pixels into triangles.
Compared to pixel grids, our triangle meshes adapt locally to surface details and allow for a sparser representation.
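The curvature-driven aggregation above can be sketched as follows. This is an illustrative toy, not the authors' implementation: it uses normal-map gradients as a cheap curvature proxy and assigns larger target triangle sizes where the proxy is small, so flat regions are coarsened. All function names and thresholds are assumptions:

```python
import numpy as np

def curvature_proxy(normals):
    """Magnitude of normal-map gradients as a cheap curvature estimate.
    normals: (H, W, 3) array of unit normals."""
    gy, gx = np.gradient(normals, axis=(0, 1))
    return np.linalg.norm(gx, axis=-1) + np.linalg.norm(gy, axis=-1)

def target_edge_length(kappa, l_min=2.0, l_max=64.0, eps=1e-6):
    """Larger triangles where curvature is low, clamped to [l_min, l_max] pixels."""
    return np.clip(1.0 / (kappa + eps), l_min, l_max)

# A perfectly flat patch (constant normal) should saturate at the coarsest size.
H, W = 32, 32
flat = np.zeros((H, W, 3))
flat[..., 2] = 1.0                  # all normals point along +z
sizes = target_edge_length(curvature_proxy(flat))
print(sizes.max())                  # flat region -> l_max = 64.0
```

A real pipeline would feed such a sizing field into a triangulator; the point here is only the flat-vs-detailed distinction driving the sparser representation.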
arXiv Detail & Related papers (2024-09-25T13:12:58Z)
- PGSR: Planar-based Gaussian Splatting for Efficient and High-Fidelity Surface Reconstruction [37.14913599050765]
We propose a fast planar-based Gaussian splatting reconstruction representation (PGSR) to achieve high-fidelity surface reconstruction.
We then introduce single-view geometric, multi-view photometric, and geometric regularization to preserve global geometric accuracy.
Our method achieves fast training and rendering while maintaining high-fidelity rendering and geometric reconstruction, outperforming 3DGS-based and NeRF-based methods.
arXiv Detail & Related papers (2024-06-10T17:59:01Z)
- High-quality Surface Reconstruction using Gaussian Surfels [18.51978059665113]
We propose a novel point-based representation, Gaussian surfels, to combine the flexible optimization procedure of 3D Gaussian points with the surface alignment of surfels.
This is achieved by setting the z-scale of 3D Gaussian points to 0, effectively flattening the original 3D ellipsoid into a 2D ellipse.
By treating the local z-axis as the normal direction, it greatly improves optimization stability and surface alignment.
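The flattening step described above is simple to state concretely. As a minimal sketch (illustrative names, not the paper's code): zeroing the z-scale in the Gaussian's scale matrix collapses the covariance from a 3D ellipsoid to a 2D ellipse, and the rotation's third column becomes the surfel normal:

```python
import numpy as np

def surfel_covariance(R, scales):
    """Covariance Sigma = R S S^T R^T with the z-scale forced to zero.
    R: 3x3 rotation matrix; scales: (sx, sy, sz)."""
    s = np.array(scales, dtype=float)
    s[2] = 0.0                      # flatten the ellipsoid along its local z-axis
    S = np.diag(s)
    return R @ S @ S.T @ R.T

R = np.eye(3)                       # identity rotation for the example
Sigma = surfel_covariance(R, (0.5, 0.3, 0.2))
normal = R[:, 2]                    # local z-axis acts as the surfel normal
print(np.linalg.matrix_rank(Sigma)) # rank 2: the Gaussian is now a flat ellipse
```

The rank-2 covariance is what gives the surfel a well-defined tangent plane, which is the source of the improved surface alignment the abstract mentions.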
arXiv Detail & Related papers (2024-04-27T04:13:39Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- PRS: Sharp Feature Priors for Resolution-Free Surface Remeshing [30.28380889862059]
We present a data-driven approach for automatic feature detection and remeshing.
Our algorithm improves over the state-of-the-art by 26% in normals F-score and 42% in perceptual $\text{RMSE}_{\text{v}}$.
arXiv Detail & Related papers (2023-11-30T12:15:45Z)
- GeoNet++: Iterative Geometric Neural Network with Edge-Aware Refinement for Joint Depth and Surface Normal Estimation [204.13451624763735]
We propose a geometric neural network with edge-aware refinement (GeoNet++) to jointly predict both depth and surface normal maps from a single image.
GeoNet++ effectively predicts depth and surface normals with strong 3D consistency and sharp boundaries.
In contrast to current metrics that focus on evaluating pixel-wise error/accuracy, 3DGM measures whether the predicted depth can reconstruct high-quality 3D surface normals.
arXiv Detail & Related papers (2020-12-13T06:48:01Z)
- Deep Active Surface Models [60.027353171412216]
Active Surface Models have a long history of being useful for modeling complex 3D surfaces, but only Active Contours have been used in conjunction with deep networks.
We introduce layers implementing Active Surface Models that can be integrated seamlessly into Graph Convolutional Networks to enforce sophisticated smoothness priors.
arXiv Detail & Related papers (2020-11-17T18:48:28Z)
- PUGeo-Net: A Geometry-centric Network for 3D Point Cloud Upsampling [103.09504572409449]
We propose a novel deep neural network based method, called PUGeo-Net, to generate uniform dense point clouds.
Thanks to its geometry-centric nature, PUGeo-Net works well for both CAD models with sharp features and scanned models with rich geometric details.
arXiv Detail & Related papers (2020-02-24T14:13:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.