Diffeomorphic Mesh Deformation via Efficient Optimal Transport for Cortical Surface Reconstruction
- URL: http://arxiv.org/abs/2305.17555v3
- Date: Mon, 18 Mar 2024 04:34:27 GMT
- Title: Diffeomorphic Mesh Deformation via Efficient Optimal Transport for Cortical Surface Reconstruction
- Authors: Tung Le, Khai Nguyen, Shanlin Sun, Kun Han, Nhat Ho, Xiaohui Xie
- Abstract summary: Mesh deformation plays a pivotal role in many 3D vision tasks including dynamic simulations, rendering, and reconstruction.
A prevalent approach in current deep learning is the set-based approach, which measures the discrepancy between two surfaces by comparing two randomly sampled point clouds from the two meshes with the Chamfer pseudo-distance.
We propose a novel metric for learning mesh deformation, defined by the sliced Wasserstein distance on meshes represented as probability measures, which generalizes the set-based approach.
- Score: 40.73187749820041
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mesh deformation plays a pivotal role in many 3D vision tasks including dynamic simulations, rendering, and reconstruction. However, defining an efficient discrepancy between predicted and target meshes remains an open problem. A prevalent approach in current deep learning is the set-based approach, which measures the discrepancy between two surfaces by comparing two randomly sampled point clouds from the two meshes with the Chamfer pseudo-distance. Nevertheless, the set-based approach still has limitations, such as lacking a theoretical guarantee for choosing the number of points in the sampled point clouds, and the pseudo-metricity and quadratic complexity of the Chamfer divergence. To address these issues, we propose a novel metric for learning mesh deformation. The metric is defined by the sliced Wasserstein distance on meshes represented as probability measures, which generalize the set-based approach. By leveraging the space of probability measures, we gain flexibility in encoding meshes using diverse forms of probability measures, such as continuous, empirical, and discrete measures via varifold representation. After encoding meshes as probability measures, we can compare them using the sliced Wasserstein distance, an effective optimal transport distance with linear computational complexity that provides a fast statistical rate for approximating the surface of meshes. To this end, we employ a neural ordinary differential equation (ODE) to deform the input surface into the target shape by modeling the trajectories of the points on the surface. Our experiments on cortical surface reconstruction demonstrate that our approach surpasses competing methods across multiple datasets and metrics.
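The two ingredients in the abstract, a sliced Wasserstein discrepancy between point clouds drawn from the two surfaces and an ODE-driven deformation of the surface points, can be sketched in a few dozen lines of PyTorch. The sketch below makes simplifying assumptions (uniform empirical measures with equal sample counts, fixed-step Euler integration instead of an adaptive neural-ODE solver, random tensors standing in for mesh samples) and is not the authors' implementation.

```python
import torch
import torch.nn as nn

def sliced_wasserstein(x, y, n_projections=128, p=2):
    """Sliced Wasserstein discrepancy between two empirical measures.

    x: (n, d) points sampled from the predicted surface
    y: (n, d) points sampled from the target surface (same n assumed,
       so 1-D optimal transport reduces to matching sorted projections)
    """
    d = x.shape[1]
    theta = torch.randn(n_projections, d, device=x.device)
    theta = theta / theta.norm(dim=1, keepdim=True)   # directions on the unit sphere
    x_proj = theta @ x.T                              # (n_projections, n)
    y_proj = theta @ y.T
    x_sorted, _ = torch.sort(x_proj, dim=1)
    y_sorted, _ = torch.sort(y_proj, dim=1)
    return (x_sorted - y_sorted).abs().pow(p).mean()

class VelocityField(nn.Module):
    """Small MLP v(x, t) giving dx/dt for the deformation ODE."""
    def __init__(self, dim=3, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        t_col = torch.full_like(x[:, :1], t)
        return self.net(torch.cat([x, t_col], dim=1))

def deform(points, field, n_steps=10):
    """Integrate point trajectories with fixed-step Euler
    (a stand-in for an adaptive neural-ODE solver)."""
    x, dt = points, 1.0 / n_steps
    for k in range(n_steps):
        x = x + dt * field(x, k * dt)
    return x

# Hypothetical training step on stand-in point clouds.
field = VelocityField()
opt = torch.optim.Adam(field.parameters(), lr=1e-4)
src_pts = torch.randn(2048, 3)   # would come from the input mesh surface
tgt_pts = torch.randn(2048, 3)   # would come from the target mesh surface

opt.zero_grad()
loss = sliced_wasserstein(deform(src_pts, field), tgt_pts)
loss.backward()
opt.step()
```

In practice the point clouds would be sampled from the actual meshes (for instance via the empirical or varifold-style measures mentioned in the abstract), and the ODE would be integrated with an adaptive solver so the resulting deformation stays diffeomorphic.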
Related papers
- Sample-Efficient Geometry Reconstruction from Euclidean Distances using Non-Convex Optimization [7.114174944371803]
The problem of finding a suitable point embedding from Euclidean distance information on point pairs arises both as a core task and as a sub-problem in machine learning.
In this paper, we aim to solve this problem given a minimal number of samples.
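For orientation, the fully observed version of this problem has a classical closed-form solution via multidimensional scaling: double-center the squared distance matrix and take a truncated eigendecomposition. The NumPy sketch below illustrates that textbook baseline under the assumption that all pairwise distances are given; it is not the paper's sample-efficient non-convex method.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Recover a point configuration (up to rigid motion) from a
    complete matrix of pairwise Euclidean distances D (n x n)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    G = -0.5 * J @ (D ** 2) @ J              # Gram matrix of centered points
    w, V = np.linalg.eigh(G)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]          # keep the top `dim` components
    w_top = np.clip(w[idx], 0.0, None)       # guard against numerical negatives
    return V[:, idx] * np.sqrt(w_top)        # n x dim embedding

# Sanity check on random planar points (hypothetical data).
X = np.random.rand(50, 2)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
X_rec = classical_mds(D, dim=2)
D_rec = np.linalg.norm(X_rec[:, None, :] - X_rec[None, :, :], axis=-1)
print(np.abs(D - D_rec).max())   # ~0 up to numerical error
```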
arXiv Detail & Related papers (2024-10-22T13:02:12Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Point Cloud Classification via Deep Set Linearized Optimal Transport [51.99765487172328]
We introduce Deep Set Linearized Optimal Transport, an algorithm designed for the efficient simultaneous embedding of point clouds into an $L^2$-space.
This embedding preserves specific low-dimensional structures within the Wasserstein space while constructing a classifier to distinguish between various classes of point clouds.
We showcase the advantages of our algorithm over the standard deep set approach through experiments on a flow dataset with a limited number of labeled point clouds.
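Plain linearized optimal transport, which this entry builds on, can be illustrated with the POT library: fix a reference point cloud, compute each input cloud's optimal transport plan to it, and use the barycentric projection as a Euclidean embedding. The sketch below is a generic illustration under that reading, with hypothetical data, and omits the deep-set component of the paper.

```python
import numpy as np
import ot  # Python Optimal Transport (POT)

def lot_embedding(ref, cloud):
    """Embed `cloud` in Euclidean space via the barycentric projection
    of its optimal transport plan onto the reference cloud `ref`.

    ref:   (k, d) reference support, uniform weights
    cloud: (m, d) input point cloud, uniform weights
    Returns a (k*d,) vector; Euclidean distances between such vectors
    approximate 2-Wasserstein distances between the original clouds.
    """
    a = ot.unif(ref.shape[0])
    b = ot.unif(cloud.shape[0])
    M = ot.dist(ref, cloud)                  # squared Euclidean cost matrix
    P = ot.emd(a, b, M)                      # optimal transport plan, (k, m)
    T = (P @ cloud) / a[:, None]             # barycentric projection onto ref
    return ((T - ref) * np.sqrt(a)[:, None]).ravel()   # weighted displacement field

# Hypothetical usage: embed two clouds and compare them with a plain norm.
rng = np.random.default_rng(0)
ref = rng.normal(size=(64, 3))
c1 = rng.normal(size=(500, 3))
c2 = rng.normal(loc=0.5, size=(500, 3))
print(np.linalg.norm(lot_embedding(ref, c1) - lot_embedding(ref, c2)))
```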
arXiv Detail & Related papers (2024-01-02T23:26:33Z)
- Elastic shape analysis of surfaces with second-order Sobolev metrics: a comprehensive numerical framework [11.523323270411959]
This paper introduces a set of numerical methods for shape analysis of 3D surfaces.
We address the computation of geodesics and geodesic distances between parametrized or unparametrized surfaces represented as 3D meshes.
arXiv Detail & Related papers (2022-04-08T18:19:05Z)
- Semi-Discrete Normalizing Flows through Differentiable Tessellation [31.474420819149724]
We propose a tessellation-based approach that learns quantization boundaries on a continuous space, complete with exact likelihood evaluations.
This is done through constructing normalizing flows on convex polytopes parameterized through a differentiable Voronoi tessellation.
We show improvements over existing methods across a range of structured data modalities, and find that we can achieve a significant gain from just adding Voronoi mixtures to a baseline model.
arXiv Detail & Related papers (2022-03-14T03:06:31Z)
- A Model for Multi-View Residual Covariances based on Perspective Deformation [88.21738020902411]
We derive a model for the covariance of the visual residuals in multi-view SfM, odometry and SLAM setups.
We validate our model with synthetic and real data and integrate it into photometric and feature-based Bundle Adjustment.
arXiv Detail & Related papers (2022-02-01T21:21:56Z)
- Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ) in complex, high-dimensional systems.
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z)
- MongeNet: Efficient Sampler for Geometric Deep Learning [17.369783838267942]
MongeNet is a fast, optimal transport-based sampler that allows for an accurate discretization of a mesh with better approximation properties.
We compare our method to the ubiquitous random uniform sampling and show that the approximation error is almost halved, with a very small computational overhead.
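The random uniform sampling baseline mentioned here is usually implemented by choosing triangles with probability proportional to their area and drawing uniform barycentric coordinates; the NumPy sketch below shows that baseline (not MongeNet itself), with hypothetical array names.

```python
import numpy as np

def sample_mesh_uniform(vertices, faces, n_samples, rng=None):
    """Uniformly sample points from a triangle mesh surface.

    vertices: (V, 3) float array; faces: (F, 3) int array.
    Triangles are chosen proportionally to their area, then points are
    drawn with uniform barycentric coordinates inside each triangle.
    """
    if rng is None:
        rng = np.random.default_rng()
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    probs = areas / areas.sum()
    idx = rng.choice(len(faces), size=n_samples, p=probs)
    # Uniform barycentric coordinates via the square-root trick.
    u, v = rng.random(n_samples), rng.random(n_samples)
    su = np.sqrt(u)
    b0, b1, b2 = 1.0 - su, su * (1.0 - v), su * v
    return (b0[:, None] * v0[idx]
            + b1[:, None] * v1[idx]
            + b2[:, None] * v2[idx])
```

MongeNet replaces this random draw with an optimal transport-based sampler that places points more evenly over the surface, which is where the reported reduction in approximation error comes from.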
arXiv Detail & Related papers (2021-04-29T17:59:01Z)
- Deep Magnification-Flexible Upsampling over 3D Point Clouds [103.09504572409449]
We propose a novel end-to-end learning-based framework to generate dense point clouds.
We first formulate the problem explicitly, which boils down to determining the weights and high-order approximation errors.
Then, we design a lightweight neural network to adaptively learn unified and sorted weights as well as the high-order refinements.
arXiv Detail & Related papers (2020-11-25T14:00:18Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.