A neural optimization framework for free-boundary diffeomorphic mapping problems and its applications
- URL: http://arxiv.org/abs/2511.11679v1
- Date: Wed, 12 Nov 2025 03:43:28 GMT
- Title: A neural optimization framework for free-boundary diffeomorphic mapping problems and its applications
- Authors: Zhehao Xu, Lok Ming Lui
- Abstract summary: We propose a neural surrogate, the Spectral Beltrami Network (SBN), that embeds the LSQC energy into a multiscale mesh-spectral architecture. Next, we propose the SBN-guided optimization framework SBN-Opt, which optimizes free-boundary diffeomorphisms for the problem, with local geometric distortion explicitly controllable.
- Score: 0.42970700836450487
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Free-boundary diffeomorphism optimization is a core ingredient in the surface mapping problem but remains notoriously difficult because the boundary is unconstrained and local bijectivity must be preserved under large deformation. Numerical Least-Squares Quasiconformal (LSQC) theory, with its provable existence, uniqueness, similarity-invariance and resolution-independence, offers an elegant mathematical remedy. However, the conventional numerical algorithm requires landmark conditioning and cannot be incorporated into gradient-based optimization. We propose a neural surrogate, the Spectral Beltrami Network (SBN), that embeds the LSQC energy into a multiscale mesh-spectral architecture. Next, we propose the SBN-guided optimization framework SBN-Opt, which optimizes free-boundary diffeomorphisms for the problem, with local geometric distortion explicitly controllable. Extensive experiments on density-equalizing maps and inconsistent surface registration demonstrate SBN-Opt's superiority over traditional numerical algorithms.
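For intuition about the quantity that quasiconformal theory controls, the sketch below computes the Beltrami coefficient of a piecewise-linear map on a single triangle. The function name and the affine-fit construction are illustrative assumptions, not the paper's implementation; |mu| < 1 is the local orientation-preserving bijectivity condition.

```python
import numpy as np

def beltrami_coefficient(z, w):
    """Beltrami coefficient mu = f_zbar / f_z of the unique affine map
    f(x) = a*x + b*conj(x) + c sending triangle z[0..2] to w[0..2]
    (vertices given as complex numbers). |mu| < 1 iff the map is
    orientation-preserving, i.e. locally bijective."""
    z = np.asarray(z, dtype=complex)
    w = np.asarray(w, dtype=complex)
    # Three vertices determine the three affine coefficients (a, b, c).
    A = np.stack([z, np.conj(z), np.ones(3)], axis=1)
    a, b, _ = np.linalg.solve(A, w)
    return b / a

tri = [0 + 0j, 1 + 0j, 0 + 1j]
mu_id = beltrami_coefficient(tri, tri)                      # identity: mu = 0
mu_shear = beltrami_coefficient(tri, [0 + 0j, 1 + 0j, 0.5 + 1j])  # mild shear
```

A conformal map has mu = 0; distortion-controlled optimization amounts to keeping |mu| bounded away from 1 on every triangle.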
Related papers
- Spectral Analysis of Hard-Constraint PINNs: The Spatial Modulation Mechanism of Boundary Functions [4.170072254495455]
This work reveals that the boundary function $B$ introduces a multiplicative spatial modulation that fundamentally alters the learning landscape. A rigorous Neural Tangent Kernel (NTK) framework for HC-PINNs is established, deriving the explicit kernel composition law. It is shown that widely used boundary functions can inadvertently induce spectral collapse, leading to optimization stagnation despite exact boundary satisfaction.
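The modulation in question is easy to see in the standard hard-constraint construction u(x) = g(x) + B(x)·N(x); the sketch below uses the common illustrative choice B(x) = x(1-x) on [0, 1] (names and choices are assumptions, not taken from the paper):

```python
import numpy as np

def hard_constrained_u(x, net, g=lambda x: 0.0 * x, B=lambda x: x * (1.0 - x)):
    """Hard-constraint ansatz u(x) = g(x) + B(x) * net(x) on [0, 1].
    B vanishes at x = 0 and x = 1, so u matches the boundary data g
    exactly for ANY network -- but B also multiplies the network output
    everywhere in the interior, which is the spatial modulation at issue."""
    return g(x) + B(x) * net(x)

x = np.linspace(0.0, 1.0, 5)
u = hard_constrained_u(x, net=lambda x: np.cos(3.0 * x))  # stand-in "network"
```

The boundary values are exact by construction, but the interior spectrum the network must learn is reshaped by B.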
arXiv Detail & Related papers (2025-12-29T08:31:58Z) - Scalable Quantum Walk-Based Heuristics for the Minimum Vertex Cover Problem [0.0]
We propose a novel quantum algorithm for the Minimum Vertex Cover (MVC) problem based on continuous-time quantum walks (CTQWs). In this framework, the coherent propagation of a quantum walker over a graph encodes its structural properties into state amplitudes. We show that the CTQW-based algorithm consistently achieves superior approximation ratios and exhibits remarkable robustness with respect to network topology.
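A minimal sketch of the underlying primitive, assuming the graph adjacency matrix is used as the Hamiltonian (a common CTQW convention, not necessarily the paper's exact setup):

```python
import numpy as np
from scipy.linalg import expm

def ctqw_probabilities(adj, start, t):
    """Continuous-time quantum walk: evolve the basis state |start>
    under U = exp(-i * A * t), with the adjacency matrix A as the
    Hamiltonian. Returns the occupation probabilities |<v|psi(t)>|^2."""
    A = np.asarray(adj, dtype=float)
    psi0 = np.zeros(A.shape[0], dtype=complex)
    psi0[start] = 1.0
    psi_t = expm(-1j * t * A) @ psi0
    return np.abs(psi_t) ** 2

# 4-cycle graph: evolution is unitary, so probabilities always sum to 1,
# and the symmetry of C4 about vertex 0 makes vertices 1 and 3 equal.
C4 = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
p = ctqw_probabilities(C4, start=0, t=1.0)
```

Heuristics of this kind then read off structural information (e.g. candidate cover vertices) from such amplitude distributions.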
arXiv Detail & Related papers (2025-12-02T17:04:57Z) - Graph Neural Regularizers for PDE Inverse Problems [62.49743146797144]
We present a framework for solving a broad class of ill-posed inverse problems governed by partial differential equations (PDEs). The forward problem is numerically solved using the finite element method (FEM). We employ physics-inspired graph neural networks as learned regularizers, providing a robust, interpretable, and generalizable alternative to standard approaches.
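For intuition, the hand-crafted baseline that such learned regularizers generalise is the graph-Laplacian smoothness prior on the mesh graph (this simple penalty is an illustrative stand-in, not the paper's network):

```python
import numpy as np

def smoothness_penalty(edges, u):
    """u^T L u = sum over edges (u_i - u_j)^2, where L is the graph
    Laplacian of the mesh: the classical smoothness prior that a learned
    graph neural regularizer generalises (illustrative stand-in only)."""
    u = np.asarray(u, dtype=float)
    return float(sum((u[i] - u[j]) ** 2 for i, j in edges))

edges = [(0, 1), (1, 2)]                            # path graph on 3 nodes
smooth = smoothness_penalty(edges, [0.0, 1.0, 2.0]) # gently varying field
jumpy = smoothness_penalty(edges, [0.0, 2.0, 0.0])  # oscillating field
```

In a regularized inversion this term is added to the FEM data-misfit, penalising parameter fields that oscillate across mesh edges.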
arXiv Detail & Related papers (2025-10-23T21:43:25Z) - Quantum-Classical Hybrid Quantized Neural Network [8.382617481718643]
We present a novel Quadratic Binary Optimization (QBO) model for quantized neural network training, enabling the use of arbitrary activation and loss functions. We employ the Quantum Conditional Gradient Descent (QCGD) algorithm, which leverages quantum computing to directly solve the QCBO problem.
arXiv Detail & Related papers (2025-06-23T02:12:36Z) - Cons-training Tensor Networks: Embedding and Optimization Over Discrete Linear Constraints [2.8834278113855896]
We introduce a novel family of tensor networks, termed constrained matrix product states (MPS). These MPS exactly incorporate arbitrary discrete linear constraints, including inequalities, into sparse block structures. The networks are particularly tailored for modeling distributions with support strictly over the feasible space.
arXiv Detail & Related papers (2024-05-15T00:13:18Z) - GloptiNets: Scalable Non-Convex Optimization with Certificates [61.50835040805378]
We present a novel approach to non-convex optimization with certificates, which handles smooth functions on the hypercube or on the torus.
By exploiting the regularity of the target function inherent in the decay of its spectrum, we obtain precise certificates while leveraging the advanced and powerful computational techniques developed to optimize neural networks.
arXiv Detail & Related papers (2023-06-26T09:42:59Z) - Physics and Equality Constrained Artificial Neural Networks: Application to Partial Differential Equations [1.370633147306388]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach.
We propose a versatile framework that can tackle both inverse and forward problems.
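The "specific way of formulating the objective" is the soft-constrained weighted sum of a PDE-residual term and a boundary term; a minimal sketch for u'' = -π²u on [0, 1] (central differences stand in for autodiff, and all names and the fixed weight `lam` are illustrative assumptions):

```python
import numpy as np

def pinn_style_loss(u, xs, lam=1.0):
    """Soft-constrained PINN-style objective for u''(x) = -pi^2 u(x)
    on [0, 1] with u(0) = u(1) = 0: mean squared PDE residual plus
    lam times the boundary mismatch. The fixed weight lam is exactly
    the modelling choice identified as a source of severe limitations."""
    h = xs[1] - xs[0]
    uu = u(xs)
    # Central-difference approximation of u'' at interior points.
    resid = (uu[2:] - 2.0 * uu[1:-1] + uu[:-2]) / h**2 + np.pi**2 * uu[1:-1]
    bc = u(0.0) ** 2 + u(1.0) ** 2
    return np.mean(resid**2) + lam * bc

xs = np.linspace(0.0, 1.0, 201)
loss_true = pinn_style_loss(lambda x: np.sin(np.pi * x), xs)  # exact solution
loss_bad = pinn_style_loss(lambda x: x * (1.0 - x), xs)       # wrong candidate
```

A constrained formulation instead treats the boundary and residual terms as separate constraints rather than folding them into one fixed weighted sum.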
arXiv Detail & Related papers (2021-09-30T05:55:35Z) - Quadratic Unconstrained Binary Optimisation via Quantum-Inspired Annealing [58.720142291102135]
We present a classical algorithm to find approximate solutions to instances of quadratic unconstrained binary optimisation.
We benchmark our approach for large scale problem instances with tuneable hardness and planted solutions.
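A classical single-flip annealer already captures the shape of the problem being solved (the schedule, move set, and parameters below are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def anneal_qubo(Q, steps=2000, t0=2.0, seed=0):
    """Single-spin-flip annealing for min_x x^T Q x over x in {0,1}^n:
    a simple classical stand-in for the quantum-inspired heuristic,
    with an illustrative linear cooling schedule."""
    rng = np.random.default_rng(seed)
    Q = np.asarray(Q, dtype=float)
    n = Q.shape[0]
    x = rng.integers(0, 2, n)
    e = x @ Q @ x
    for k in range(steps):
        T = t0 * (1.0 - k / steps) + 1e-9   # cool from t0 down to ~0
        i = rng.integers(n)
        y = x.copy()
        y[i] ^= 1                            # propose a single bit flip
        de = y @ Q @ y - e
        # Metropolis rule: always accept improvements, sometimes accept uphill.
        if de < 0 or rng.random() < np.exp(-de / T):
            x, e = y, e + de
    return x, e

# Planted instance: Q = -I has its minimum (energy -n) at x = all ones.
x_best, e_best = anneal_qubo(-np.eye(4))
```

Tuneable-hardness benchmarks replace this planted diagonal instance with structured couplings whose optimum is known by construction.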
arXiv Detail & Related papers (2021-08-18T09:26:17Z) - Fixed Depth Hamiltonian Simulation via Cartan Decomposition [59.20417091220753]
We present a constructive algorithm for generating quantum circuits with time-independent depth.
We highlight our algorithm for special classes of models, including Anderson localization in the one-dimensional transverse-field XY model.
In addition to providing exact circuits for a broad set of spin and fermionic models, our algorithm provides broad analytic and numerical insight into optimal Hamiltonian simulations.
arXiv Detail & Related papers (2021-04-01T19:06:00Z) - Demystifying Batch Normalization in ReLU Networks: Equivalent Convex Optimization Models and Implicit Regularization [29.411334761836958]
We introduce an analytic framework based on convex duality to obtain exact convex representations of weight-decay regularized ReLU networks with BN.
Our analyses also show that optimal layer weights can be obtained as simple closed-form formulas in the high-dimensional and/or overparameterized regimes.
arXiv Detail & Related papers (2021-03-02T06:36:31Z) - Physics-informed neural networks with hard constraints for inverse design [3.8191831921441337]
We propose a new deep learning method -- physics-informed neural networks with hard constraints (hPINNs) -- for solving topology optimization problems.
We demonstrate the effectiveness of hPINN for a holography problem in optics and a fluid problem of Stokes flow.
arXiv Detail & Related papers (2021-02-09T03:18:15Z) - Efficient Methods for Structured Nonconvex-Nonconcave Min-Max Optimization [98.0595480384208]
We propose a generalization of the extragradient method which converges to a stationary point.
The algorithm applies not only to Euclidean spaces, but also to general $\ell_p$-normed finite-dimensional vector spaces.
arXiv Detail & Related papers (2020-10-31T21:35:42Z) - Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under the sparsity constraint.
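A toy version of the coupled update on the simplest bilinear objective, rank-1 factorization (a simplified stand-in for the synchronous scheme, not the paper's CoGD with its sparsity constraint):

```python
import numpy as np

def bilinear_descent(Y, steps=3000, lr=0.02, seed=0):
    """Gradient descent on f(a, b) = ||Y - a b^T||_F^2, the prototypical
    bilinear problem: a and b are coupled through the residual, and both
    are updated from gradients evaluated at the current pair, i.e. a
    synchronous update rather than alternating minimization."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    a, b = rng.standard_normal(m), rng.standard_normal(n)
    for _ in range(steps):
        R = Y - np.outer(a, b)                 # shared residual couples a and b
        ga, gb = -2.0 * R @ b, -2.0 * R.T @ a  # gradients at the current pair
        a, b = a - lr * ga, b - lr * gb        # synchronous update
    return a, b, np.linalg.norm(Y - np.outer(a, b))

Y = np.outer([1.0, 2.0], [3.0, 1.0, 0.5])      # exact rank-1 target
a, b, err = bilinear_descent(Y)
```

Because the target is exactly rank 1, the coupled iteration drives the residual toward zero; an alternating scheme would instead freeze one variable while solving for the other.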
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.