Neural Octahedral Field: Octahedral prior for simultaneous smoothing and sharp edge regularization
- URL: http://arxiv.org/abs/2408.00303v1
- Date: Thu, 1 Aug 2024 06:02:59 GMT
- Title: Neural Octahedral Field: Octahedral prior for simultaneous smoothing and sharp edge regularization
- Authors: Ruichen Zheng, Tao Yu
- Abstract summary: We propose to guide surface reconstruction with a new variant of neural field, the octahedral field.
Simultaneously fitting and smoothing the octahedral field alongside the implicit geometry behaves analogously to bilateral filtering.
Our method outperforms various traditional and neural approaches across extensive experiments.
- Score: 9.167571374234166
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural implicit representation, the parameterization of a distance function as a coordinate neural field, has emerged as a promising approach to surface reconstruction from unoriented point clouds. To enforce consistent orientation, existing methods focus on regularizing the gradient of the distance function, such as constraining it to be of unit norm, minimizing its divergence, or aligning it with the eigenvector of the Hessian that corresponds to the zero eigenvalue. However, in the presence of large scanning noise, they tend to either overfit the noisy input or produce an excessively smooth reconstruction. In this work, we propose to guide surface reconstruction with a new variant of neural field, the octahedral field, leveraging the spherical harmonics representation of octahedral frames that originated in hexahedral meshing. Such a field automatically snaps to geometric features when constrained to be smooth, and naturally preserves sharp angles when interpolated over creases. Simultaneously fitting and smoothing the octahedral field alongside the implicit geometry behaves analogously to bilateral filtering, yielding a smooth reconstruction that preserves sharp edges. Despite operating purely pointwise, our method outperforms various traditional and neural approaches across extensive experiments, and is very competitive with methods that require normal and data priors. Our full implementation is available at: https://github.com/Ankbzpx/frame-field.
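As a reading aid for the abstract above, here is a minimal, schematic PyTorch sketch of the kind of joint objective it describes: fit an implicit distance field to the scan, regularize its gradient (eikonal term), and smooth an auxiliary per-point coefficient field standing in for the octahedral frames. The network names, loss weights, and the finite-difference smoothness term are assumptions; the spherical-harmonics octahedral constraints and the gradient alignment are omitted. See the linked repository for the actual implementation.

```python
# Schematic sketch only (assumed names/weights); the real method additionally
# constrains the 9 coefficients to valid octahedral frames via spherical
# harmonics and aligns them with the SDF gradient.
import torch
import torch.nn as nn


def mlp(in_dim, out_dim, width=128, depth=4):
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.Softplus(beta=100)]
        d = width
    layers.append(nn.Linear(d, out_dim))
    return nn.Sequential(*layers)


sdf_net = mlp(3, 1)    # implicit distance field f(x)
frame_net = mlp(3, 9)  # per-point frame coefficients (octahedral frames are
                       # typically encoded as 9 band-4 spherical harmonics)

points = torch.rand(1024, 3)  # stand-in for an unoriented, noisy scan
params = list(sdf_net.parameters()) + list(frame_net.parameters())
opt = torch.optim.Adam(params, lr=1e-4)

for step in range(100):
    # (1) data term: scanned points should lie on the zero level set
    loss_fit = sdf_net(points).abs().mean()

    # (2) eikonal term on random samples: |grad f| = 1
    y = torch.rand(1024, 3, requires_grad=True)
    g = torch.autograd.grad(sdf_net(y).sum(), y, create_graph=True)[0]
    loss_eik = ((g.norm(dim=-1) - 1.0) ** 2).mean()

    # (3) frame smoothness: coefficients should vary slowly between nearby
    #     points, a crude stand-in for the paper's octahedral smoothness energy
    eps = 0.01 * torch.randn_like(points)
    loss_smooth = (frame_net(points) - frame_net(points + eps)).pow(2).sum(-1).mean()

    loss = loss_fit + 0.1 * loss_eik + 0.1 * loss_smooth
    opt.zero_grad()
    loss.backward()
    opt.step()
```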
Related papers
- ND-SDF: Learning Normal Deflection Fields for High-Fidelity Indoor Reconstruction [50.07671826433922]
It is non-trivial to simultaneously recover meticulous geometry and preserve smoothness across regions with differing characteristics.
We propose ND-SDF, which learns a Normal Deflection field to represent the angular deviation between the scene normal and the prior normal.
Our method not only obtains smooth weakly textured regions such as walls and floors but also preserves the geometric details of complex structures.
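As a rough illustration of the "normal deflection" idea in the summary above (a sketch only: the axis-angle parameterization, the network `deflection_net`, and its inputs are assumptions, not ND-SDF's actual formulation), one can let a small network predict a rotation that bends a prior normal toward the scene normal:

```python
# Schematic sketch (assumed parameterization): a network predicts an
# axis-angle deflection that rotates a prior normal toward the scene normal.
import torch
import torch.nn as nn
import torch.nn.functional as F

deflection_net = nn.Sequential(nn.Linear(3 + 3, 64), nn.ReLU(), nn.Linear(64, 3))

def deflect(prior_normal, x):
    """Rotate prior_normal by the predicted axis-angle vector (Rodrigues formula)."""
    w = deflection_net(torch.cat([x, prior_normal], dim=-1))  # axis * angle
    theta = w.norm(dim=-1, keepdim=True).clamp_min(1e-8)
    k = w / theta                                             # unit rotation axis
    n = prior_normal
    return (n * torch.cos(theta)
            + torch.cross(k, n, dim=-1) * torch.sin(theta)
            + k * (k * n).sum(-1, keepdim=True) * (1 - torch.cos(theta)))

x = torch.rand(8, 3)
prior_n = F.normalize(torch.randn(8, 3), dim=-1)
corrected_n = deflect(prior_n, x)  # would be supervised against scene normals
```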
arXiv Detail & Related papers (2024-08-22T17:59:01Z) - Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
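The summary above describes stabilizing training by linearly interpolating parameter iterates. The snippet below is a generic lookahead-style sketch of that idea; the interpolation weight `alpha`, the inner step count, and the helper name are assumptions, not the paper's analyzed scheme.

```python
# Generic sketch of iterate interpolation (lookahead-style averaging).
import copy
import torch

def interpolated_step(model, optimizer, loss_fn, data, alpha=0.5, inner_steps=5):
    """Run a few ordinary optimizer steps, then pull parameters part-way back."""
    anchor = copy.deepcopy(model.state_dict())  # theta_t
    for _ in range(inner_steps):                # fast inner updates
        optimizer.zero_grad()
        loss_fn(model, data).backward()
        optimizer.step()
    with torch.no_grad():
        # theta_{t+1} = theta_t + alpha * (theta_fast - theta_t)
        for name, p in model.named_parameters():
            p.copy_(anchor[name] + alpha * (p - anchor[name]))
```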
arXiv Detail & Related papers (2023-10-20T12:45:12Z) - NeuRBF: A Neural Fields Representation with Adaptive Radial Basis Functions [93.02515761070201]
We present a novel type of neural fields that uses general radial bases for signal representation.
Our method builds upon general radial bases with flexible kernel position and shape, which have higher spatial adaptivity and can more closely fit target signals.
When applied to neural radiance field reconstruction, our method achieves state-of-the-art rendering quality, with small model size and comparable training speed.
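As a toy illustration of "general radial bases with flexible kernel position and shape" (a sketch only: the anisotropic Gaussian kernel, the number of bases, and the module name are assumptions, not NeuRBF's architecture):

```python
# Toy radial-basis field: learnable centers, per-kernel anisotropic shape
# matrices, and learnable output weights.
import torch
import torch.nn as nn

class RBFField(nn.Module):
    def __init__(self, num_bases=256, in_dim=3, out_dim=1):
        super().__init__()
        self.centers = nn.Parameter(torch.rand(num_bases, in_dim))              # kernel positions
        self.shapes = nn.Parameter(torch.eye(in_dim).repeat(num_bases, 1, 1))   # kernel shapes
        self.weights = nn.Parameter(torch.randn(num_bases, out_dim) * 0.01)

    def forward(self, x):                                        # x: (N, in_dim)
        d = x[:, None, :] - self.centers[None]                   # (N, B, in_dim)
        d = torch.einsum('nbi,bij->nbj', d, self.shapes)         # anisotropic scaling
        phi = torch.exp(-(d ** 2).sum(-1))                       # Gaussian radial basis
        return phi @ self.weights                                # (N, out_dim)

field = RBFField()
values = field(torch.rand(4, 3))
```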
arXiv Detail & Related papers (2023-09-27T06:32:05Z) - Neural-Singular-Hessian: Implicit Neural Representation of Unoriented Point Clouds by Enforcing Singular Hessian [44.28251558359345]
We propose a new approach for reconstructing surfaces from point clouds.
Our technique aligns the gradients for a near-surface point and its on-surface projection point, producing a rough but faithful shape within just a few iterations.
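A rough sketch of the gradient-alignment idea in the summary above (assumptions: the projection formula, the cosine penalty, and `sdf_net`; the paper's singular-Hessian term itself is not reproduced here):

```python
# Sketch (assumed formulation): align the SDF gradient at a near-surface query
# with the gradient at its projection onto the current zero level set.
import torch
import torch.nn as nn
import torch.nn.functional as F

sdf_net = nn.Sequential(nn.Linear(3, 64), nn.Softplus(beta=100), nn.Linear(64, 1))

def sdf_grad(x):
    x = x.detach().clone().requires_grad_(True)
    return torch.autograd.grad(sdf_net(x).sum(), x, create_graph=True)[0]

q = torch.rand(256, 3)                            # near-surface query points
g_q = sdf_grad(q)
p = q - sdf_net(q) * F.normalize(g_q, dim=-1)     # project q onto f = 0
g_p = sdf_grad(p)                                 # projection detached for simplicity
loss_align = (1.0 - F.cosine_similarity(g_q, g_p, dim=-1)).mean()
```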
arXiv Detail & Related papers (2023-09-04T20:10:38Z) - Explicit Neural Surfaces: Learning Continuous Geometry With Deformation Fields [33.38609930708073]
We introduce Explicit Neural Surfaces (ENS), an efficient smooth surface representation that encodes topology with a deformation field from a known base domain.
Compared to implicit surfaces, ENS trains faster and has several orders of magnitude faster inference times.
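A minimal sketch of the "deformation field from a known base domain" idea (assumptions: a unit-sphere base domain sampled at random, a plain MLP deformation, and a one-sided chamfer fitting loss; not the ENS training pipeline, which deforms an actual base mesh):

```python
# Sketch: represent a surface explicitly as an MLP deformation of points
# sampled on a fixed base domain (here, the unit sphere).
import torch
import torch.nn as nn
import torch.nn.functional as F

deform = nn.Sequential(nn.Linear(3, 128), nn.ReLU(), nn.Linear(128, 3))

def sample_base(n):
    return F.normalize(torch.randn(n, 3), dim=-1)   # points on the unit sphere

def surface_points(n):
    u = sample_base(n)
    return u + deform(u)          # explicit surface = base point + offset

# One-sided chamfer fit to a target point cloud (assumed loss)
target = torch.rand(2048, 3)
pred = surface_points(2048)
loss = torch.cdist(pred, target).min(dim=1).values.mean()
```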
arXiv Detail & Related papers (2023-06-05T15:24:33Z) - Machine learning algorithms for three-dimensional mean-curvature computation in the level-set method [0.0]
We propose a data-driven mean-curvature solver for the level-set method.
Our proposed system can yield more accurate mean-curvature estimations than modern particle-based interface reconstruction.
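As a schematic of a "data-driven mean-curvature solver for the level-set method" (hedged: the 3x3x3 stencil input, network size, and normalization are assumptions; the paper's feature engineering and training data are not reproduced):

```python
# Sketch: regress mean curvature from a local stencil of level-set values.
import torch
import torch.nn as nn

curvature_net = nn.Sequential(
    nn.Linear(27 + 1, 64), nn.ReLU(),   # 3x3x3 stencil of phi values + grid spacing h
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),                   # predicted dimensionless curvature h * kappa
)

stencil = torch.randn(16, 27)           # stand-in for sampled level-set stencils
h = torch.full((16, 1), 0.05)
hk_pred = curvature_net(torch.cat([stencil, h], dim=-1))
# Training would supervise hk_pred against analytic curvatures (e.g. spheres),
# which is where the data-driven part comes in.
```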
arXiv Detail & Related papers (2022-08-18T20:19:22Z) - Critical Regularizations for Neural Surface Reconstruction in the Wild [26.460011241432092]
We present RegSDF, which shows that proper point cloud supervisions and geometry regularizations are sufficient to produce high-quality and robust reconstruction results.
RegSDF is able to reconstruct surfaces with fine details even for open scenes with complex topologies and unstructured camera trajectories.
arXiv Detail & Related papers (2022-06-07T08:11:22Z) - Learning Smooth Neural Functions via Lipschitz Regularization [92.42667575719048]
We introduce a novel regularization designed to encourage smooth latent spaces in neural fields.
Compared with prior Lipschitz regularized networks, ours is computationally fast and can be implemented in four lines of code.
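The "four lines of code" refer to a per-layer weight normalization with a learned Lipschitz bound plus a product-of-bounds regularizer; the sketch below follows that recipe, though the exact module layout and regularization weight are assumptions.

```python
# Sketch of Lipschitz-regularized linear layers: each layer carries a learned
# bound c; rows of W are rescaled so the layer's Lipschitz constant (inf-norm)
# stays below softplus(c), and the product of per-layer bounds is penalized.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LipschitzLinear(nn.Linear):
    def __init__(self, in_f, out_f):
        super().__init__(in_f, out_f)
        self.c = nn.Parameter(torch.tensor(10.0))   # learned per-layer bound

    def forward(self, x):
        bound = F.softplus(self.c)
        row_sums = self.weight.abs().sum(dim=1, keepdim=True)
        scale = torch.clamp(bound / row_sums, max=1.0)   # only ever shrink rows
        return F.linear(x, self.weight * scale, self.bias)

layers = [LipschitzLinear(3, 64), LipschitzLinear(64, 64), LipschitzLinear(64, 1)]
net = nn.Sequential(layers[0], nn.ReLU(), layers[1], nn.ReLU(), layers[2])

def lipschitz_penalty():
    # Upper bound on the network's Lipschitz constant.
    return torch.stack([F.softplus(l.c) for l in layers]).prod()

loss = net(torch.rand(8, 3)).pow(2).mean() + 1e-6 * lipschitz_penalty()
```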
arXiv Detail & Related papers (2022-02-16T21:24:54Z) - Spatio-Temporal Variational Gaussian Processes [26.60276485130467]
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural-gradient variational inference.
We derive a sparse approximation that constructs a state-space model over a reduced set of inducing points.
We show that for separable Markov kernels the full and sparse cases recover exactly the standard variational GP.
arXiv Detail & Related papers (2021-11-02T16:53:31Z) - Point2Mesh: A Self-Prior for Deformable Meshes [83.31236364265403]
We introduce Point2Mesh, a technique for reconstructing a surface mesh from an input point cloud.
The self-prior encapsulates reoccurring geometric repetitions from a single shape within the weights of a deep neural network.
We show that Point2Mesh converges to a desirable solution, unlike a prescribed smoothness prior, which often becomes trapped in undesirable local minima.
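A highly simplified sketch of the self-prior idea (assumptions: vertex offsets predicted by a small MLP from initial vertex positions and a symmetric chamfer loss; the actual Point2Mesh pipeline operates on mesh connectivity with a MeshCNN-style network):

```python
# Sketch: optimize network weights (the "self-prior") so that a deformed
# initial mesh matches the input point cloud.
import torch
import torch.nn as nn

offset_net = nn.Sequential(nn.Linear(3, 128), nn.ReLU(), nn.Linear(128, 3))

init_vertices = torch.rand(500, 3)       # vertices of an initial coarse mesh
point_cloud = torch.rand(2048, 3)        # input scan
opt = torch.optim.Adam(offset_net.parameters(), lr=1e-3)

def chamfer(a, b):
    d = torch.cdist(a, b)
    return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()

for step in range(200):
    verts = init_vertices + offset_net(init_vertices)   # deform via the network
    loss = chamfer(verts, point_cloud)
    opt.zero_grad()
    loss.backward()
    opt.step()
```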
arXiv Detail & Related papers (2020-05-22T10:01:04Z) - Neural Subdivision [58.97214948753937]
This paper introduces Neural Subdivision, a novel framework for data-driven coarse-to-fine geometry modeling.
We optimize for the same set of network weights across all local mesh patches, thus providing an architecture that is not constrained to a specific input mesh, fixed genus, or category.
We demonstrate that even when trained on a single high-resolution mesh our method generates reasonable subdivisions for novel shapes.
arXiv Detail & Related papers (2020-05-04T20:03:21Z)