ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks
- URL: http://arxiv.org/abs/2102.07951v1
- Date: Tue, 16 Feb 2021 04:07:13 GMT
- Title: ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks
- Authors: Boulbaba Ben Amor, Sylvain Arguillère and Ling Shao
- Abstract summary: In this work, we make use of deep residual neural networks to solve the non-stationary ODE (flow equation) based on an Euler discretization scheme.
We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations.
- Score: 86.37110868126548
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In deformable registration, the geometric framework - large deformation
diffeomorphic metric mapping or LDDMM, in short - has inspired numerous
techniques for comparing, deforming, averaging and analyzing shapes or images.
Grounded in flows, which are akin to the equations of motion used in fluid
dynamics, LDDMM algorithms solve the flow equation in the space of plausible
deformations, i.e. diffeomorphisms. In this work, we make use of deep residual
neural networks to solve the non-stationary ODE (flow equation) based on an
Euler discretization scheme. The central idea is to represent time-dependent
velocity fields as fully connected ReLU neural networks (building blocks) and
derive optimal weights by minimizing a regularized loss function. Computing
minimizing paths between deformations, and thus between shapes, reduces to finding
optimal network parameters by back-propagating over the intermediate building
blocks. Geometrically, at each time step, ResNet-LDDMM searches for an optimal
partition of the space into multiple polytopes, and then computes optimal
velocity vectors as affine transformations on each of these polytopes. As a
result, different parts of the shape, even if they are close (such as two
fingers of a hand), can be made to belong to different polytopes, and therefore
be moved in different directions without costing too much energy. Importantly,
we show how diffeomorphic transformations, or more precisely bilipschitz
transformations, are predicted by our algorithm. We illustrate these ideas on
diverse registration problems of 3D shapes under complex topology-preserving
transformations. We thus provide essential foundations for more advanced shape
variability analysis under a novel joint geometric and neural-network
Riemannian-like framework, i.e. ResNet-LDDMM.
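What follows is a minimal, illustrative sketch of this scheme, assuming PyTorch; the class and function names (VelocityBlock, ResNetLDDMMFlow, registration_loss) are hypothetical, and the paper's actual data-fidelity and regularization terms may differ. Each residual block implements one explicit Euler step x_{k+1} = x_k + dt * v_k(x_k) of the flow equation, with the time-dependent velocity field v_k represented by a fully connected ReLU network:

```python
import torch
import torch.nn as nn

class VelocityBlock(nn.Module):
    """One building block: a fully connected ReLU network v_k: R^3 -> R^3.

    A ReLU network is piecewise affine, so each block implicitly partitions
    the space into polytopes and applies an affine map on each of them.
    """
    def __init__(self, dim=3, width=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, dim),
        )

    def forward(self, x):
        return self.net(x)

class ResNetLDDMMFlow(nn.Module):
    """Euler discretization of the flow ODE dphi_t/dt = v_t(phi_t), phi_0 = id."""
    def __init__(self, n_steps=10, dim=3, width=128):
        super().__init__()
        self.blocks = nn.ModuleList([VelocityBlock(dim, width) for _ in range(n_steps)])
        self.dt = 1.0 / n_steps

    def forward(self, x):
        velocities = []
        for v_k in self.blocks:
            vx = v_k(x)
            velocities.append(vx)
            x = x + self.dt * vx  # one explicit Euler step (one residual block)
        return x, velocities

def registration_loss(moved, target, velocities, lam=1e-2):
    """Data term plus a kinetic-energy-style regularizer on the velocities.

    A symmetric Chamfer distance is used here as a simple stand-in for the
    paper's fidelity term.
    """
    d = torch.cdist(moved, target)  # pairwise distances between the point sets
    chamfer = d.min(dim=1).values.mean() + d.min(dim=0).values.mean()
    energy = sum((v ** 2).sum(dim=1).mean() for v in velocities)
    return chamfer + lam * energy

# Usage: register a source point cloud onto a target by back-propagating
# through all intermediate building blocks.
source, target = torch.randn(1000, 3), torch.randn(1000, 3)
flow = ResNetLDDMMFlow(n_steps=10)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    moved, vels = flow(source)
    loss = registration_loss(moved, target, vels)
    loss.backward()
    opt.step()
```

Because every ReLU block is piecewise affine, each Euler step moves different polytopes of the space with different affine velocities, which is exactly how nearby parts of a shape (e.g. two fingers) can be displaced in different directions at low energy.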
Related papers
- Transolver: A Fast Transformer Solver for PDEs on General Geometries [66.82060415622871]
We present Transolver, which learns intrinsic physical states hidden behind discretized geometries.
By computing attention over physics-aware tokens encoded from slices, Transolver can effectively capture intricate physical correlations.
Transolver achieves consistent state-of-the-art results with a 22% relative gain across six standard benchmarks and also excels in large-scale industrial simulations (a generic slice-attention sketch appears after this list).
arXiv Detail & Related papers (2024-02-04T06:37:38Z)
- Physics-informed neural networks for transformed geometries and manifolds [0.0]
We propose a novel method for integrating geometric transformations within PINNs to robustly accommodate geometric variations.
We demonstrate the enhanced flexibility over traditional PINNs, especially under geometric variations.
The proposed framework presents an outlook for training deep neural operators over parametrized geometries.
arXiv Detail & Related papers (2023-11-27T15:47:33Z)
- Breaking Boundaries: Distributed Domain Decomposition with Scalable Physics-Informed Neural PDE Solvers [3.826644006708634]
We present an end-to-end parallelization of Mosaic Flow, combining data parallel training and domain parallelism for inference on large-scale problems.
Our distributed domain decomposition algorithm enables scalable inferences for solving the Laplace equation on domains 4096 times larger than the training domain.
arXiv Detail & Related papers (2023-08-28T02:25:11Z)
- A DeepParticle method for learning and generating aggregation patterns in multi-dimensional Keller-Segel chemotaxis systems [3.6184545598911724]
We study a regularized interacting particle method for computing aggregation patterns and near-singular solutions of a Keller-Segel (KS) chemotaxis system in two and three space dimensions.
We further develop the DeepParticle (DP) method to learn and generate solutions under variations of physical parameters.
arXiv Detail & Related papers (2022-08-31T20:52:01Z)
- A Scalable Combinatorial Solver for Elastic Geometrically Consistent 3D Shape Matching [69.14632473279651]
We present a scalable algorithm for globally optimizing over the space of geometrically consistent mappings between 3D shapes.
We propose a novel primal problem coupled with a Lagrange dual problem that is several orders of magnitude faster than previous solvers.
arXiv Detail & Related papers (2022-04-27T09:47:47Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D (a generic message-passing sketch appears after this list).
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Deep convolutional neural network for shape optimization using level-set approach [0.0]
This article presents a reduced-order modeling methodology for shape optimization applications via deep convolutional neural networks (CNNs).
A CNN-based reduced-order model (ROM) is constructed in a completely data-driven manner and is suited for non-intrusive applications.
arXiv Detail & Related papers (2022-01-17T04:41:51Z)
- Revisiting Transformation Invariant Geometric Deep Learning: Are Initial Representations All You Need? [80.86819657126041]
We show that transformation-invariant and distance-preserving initial representations are sufficient to achieve transformation invariance.
Specifically, we realize transformation-invariant and distance-preserving initial point representations by modifying multi-dimensional scaling (a classical-MDS sketch appears after this list).
We prove that TinvNN can strictly guarantee transformation invariance, being general and flexible enough to be combined with existing neural networks.
arXiv Detail & Related papers (2021-12-23T03:52:33Z)
- Transferable Model for Shape Optimization subject to Physical Constraints [0.0]
We provide a method which enables a neural network to transform objects subject to given physical constraints.
A U-Net architecture is used to learn the underlying physical behaviour of fluid flows.
The network is used to infer the solution of flow simulations.
arXiv Detail & Related papers (2021-03-19T13:49:21Z)
- Dense Non-Rigid Structure from Motion: A Manifold Viewpoint [162.88686222340962]
The Non-Rigid Structure-from-Motion (NRSfM) problem aims to recover the 3D geometry of a deforming object from its 2D feature correspondences across multiple frames.
We show that our approach significantly improves accuracy, scalability, and robustness against noise.
arXiv Detail & Related papers (2020-06-15T09:15:54Z)
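For the Transolver entry above, here is a generic sketch (assuming PyTorch) of attention over slice tokens; the class name SliceAttention and all hyperparameters are hypothetical, and this is not the paper's actual architecture:

```python
import torch
import torch.nn as nn

class SliceAttention(nn.Module):
    """Pool point features into a few slice tokens, run self-attention among
    the tokens, then broadcast the result back to the points."""
    def __init__(self, dim=64, n_slices=32, n_heads=4):
        super().__init__()
        self.assign = nn.Linear(dim, n_slices)  # soft point-to-slice assignment
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, x):                             # x: (B, N, dim) point features
        w = self.assign(x).softmax(dim=1)             # (B, N, S), normalized over points
        tokens = torch.einsum('bns,bnd->bsd', w, x)   # (B, S, dim) slice tokens
        tokens, _ = self.attn(tokens, tokens, tokens) # attention among tokens
        return x + torch.einsum('bns,bsd->bnd', w, tokens)  # broadcast back to points

# Usage on a toy batch of 500 mesh points with 64-dim features.
y = SliceAttention()(torch.randn(2, 500, 64))
```

Attending among a small number of tokens instead of all mesh points is what keeps such a solver fast on large discretized geometries.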
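For the Message Passing Neural PDE Solvers entry above, here is a generic single message-passing step (assuming PyTorch); names and sizes are hypothetical and this is not the paper's exact architecture. Learned message and update networks take the place of hand-designed stencils such as finite differences:

```python
import torch
import torch.nn as nn

class MessagePassingStep(nn.Module):
    """One graph message-passing update over node features h."""
    def __init__(self, hidden=64):
        super().__init__()
        self.message = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
        self.update = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, hidden))

    def forward(self, h, edges):
        src, dst = edges                                 # (E,) index tensors
        m = self.message(torch.cat([h[src], h[dst]], dim=-1))
        agg = torch.zeros_like(h).index_add_(0, dst, m)  # sum messages per node
        return h + self.update(torch.cat([h, agg], dim=-1))

# Usage on a toy graph with 10 nodes and 30 random edges.
h = torch.randn(10, 64)
edges = (torch.randint(0, 10, (30,)), torch.randint(0, 10, (30,)))
h = MessagePassingStep()(h, edges)
```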
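For the TinvNN entry above, here is a sketch of classical multi-dimensional scaling, the textbook procedure that such distance-preserving initial representations build on; the function name is hypothetical and the paper's modification is not reproduced here:

```python
import numpy as np

def classical_mds(D, k):
    """Embed n points into R^k from an n-by-n pairwise distance matrix D,
    preserving distances as well as possible (classical MDS)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    B = -0.5 * J @ (D ** 2) @ J          # double-centered Gram matrix
    w, V = np.linalg.eigh(B)             # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]        # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Usage: recover 2D coordinates (up to rotation/translation) from distances.
pts = np.random.rand(20, 2)
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
coords = classical_mds(D, 2)
```

Because pairwise distances are invariant to rotations and translations, coordinates built from them are transformation-invariant by construction.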