A function space perspective on stochastic shape evolution
- URL: http://arxiv.org/abs/2302.05382v1
- Date: Fri, 10 Feb 2023 17:10:32 GMT
- Title: A function space perspective on stochastic shape evolution
- Authors: Elizabeth Baker and Thomas Besnier and Stefan Sommer
- Abstract summary: This paper presents a new shape model based on a description of shapes as functions in a Sobolev space.
Using an explicit orthonormal basis as a reference frame for the noise, the model is independent of the parameterisation of the mesh.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modelling randomness in shape data, for example, the evolution of shapes of
organisms in biology, requires stochastic models of shapes. This paper presents
a new stochastic shape model based on a description of shapes as functions in a
Sobolev space. Using an explicit orthonormal basis as a reference frame for the
noise, the model is independent of the parameterisation of the mesh. We define
the stochastic model, explore its properties, and illustrate examples of
stochastic shape evolutions using the resulting numerical framework.
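To make the basis-driven noise construction concrete, the following is a minimal sketch under stated assumptions (it is not the authors' exact model): a closed planar curve is discretised as samples of a function on [0, 2*pi), the driving noise is expanded in a truncated real Fourier basis, and the shape is evolved with Euler-Maruyama steps. The basis size, the coefficient decay (standing in for Sobolev smoothness) and the step size are all illustrative choices.

```python
import numpy as np

# Minimal sketch: evolve a closed planar curve c(theta) under noise expanded
# in a truncated real Fourier basis.  The decay of the coefficients lam
# stands in for Sobolev smoothness; all constants below are illustrative.
rng = np.random.default_rng(0)

n_points = 200                                   # discretisation of the curve parameter
theta = np.linspace(0.0, 2 * np.pi, n_points, endpoint=False)
curve = np.stack([np.cos(theta), np.sin(theta)], axis=1)   # initial shape: unit circle

K = 10                                           # number of Fourier modes per trig family
def fourier_basis(t):
    """Real orthonormal Fourier basis on [0, 2*pi), evaluated at the parameters t."""
    cols = [np.full_like(t, 1.0 / np.sqrt(2 * np.pi))]
    for k in range(1, K + 1):
        cols.append(np.cos(k * t) / np.sqrt(np.pi))
        cols.append(np.sin(k * t) / np.sqrt(np.pi))
    return np.stack(cols, axis=1)                # shape (n_points, 2K + 1)

Phi = fourier_basis(theta)
mode = np.array([0] + [k for k in range(1, K + 1) for _ in (0, 1)], dtype=float)
lam = (1.0 + mode ** 2) ** -1.5                  # smoothness-inducing coefficient decay

dt, n_steps, sigma = 1e-2, 100, 0.5
for _ in range(n_steps):
    # independent Brownian increments, one per basis function and coordinate
    dW = rng.normal(0.0, np.sqrt(dt), size=(Phi.shape[1], 2))
    curve = curve + sigma * Phi @ (lam[:, None] * dW)

print("evolved curve:", curve.shape)
```

Because the noise lives in the fixed basis rather than on the vertices, refining or re-indexing the discretisation only changes where the basis functions are evaluated, which is the kind of parameterisation independence the abstract refers to.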
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
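As a rough illustration of what maximum likelihood estimation with MCMC looks like for an energy-based prior (a generic sketch, not the paper's latent neural ODE model), the snippet below fits a toy one-parameter energy E_theta(z) = 0.5 * theta * ||z||^2 by contrasting data samples with short-run Langevin samples from the model. The energy form, step sizes and loop lengths are assumptions made for illustration.

```python
import numpy as np

# Toy sketch of maximum likelihood training with MCMC for an energy-based model.
# Energy: E_theta(z) = 0.5 * theta * ||z||^2, so p_theta is Gaussian N(0, I / theta).
# MLE gradient: d/dtheta log p(z_data) = -dE/dtheta(z_data) + E_{z ~ p_theta}[dE/dtheta(z)].
rng = np.random.default_rng(0)

dim, n_data = 2, 2000
true_theta = 4.0
data = rng.normal(0.0, 1.0 / np.sqrt(true_theta), size=(n_data, dim))

def grad_E_z(z, theta):              # gradient of the energy w.r.t. z (used by Langevin)
    return theta * z

def dE_dtheta(z):                    # gradient of the energy w.r.t. the parameter theta
    return 0.5 * np.sum(z ** 2, axis=1)

def langevin_sample(theta, n, steps=100, eps=0.02):
    """Short-run (unadjusted) Langevin MCMC targeting p_theta."""
    z = rng.normal(size=(n, dim))
    for _ in range(steps):
        z = z - 0.5 * eps * grad_E_z(z, theta) + np.sqrt(eps) * rng.normal(size=z.shape)
    return z

theta, lr = 1.0, 0.1
for _ in range(300):
    z_model = langevin_sample(theta, n_data)
    grad = -dE_dtheta(data).mean() + dE_dtheta(z_model).mean()   # ascend the log-likelihood
    theta += lr * grad

print("estimated theta:", round(theta, 2), "(true value:", true_theta, ")")
```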
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Representer Point Selection for Explaining Regularized High-dimensional Models [105.75758452952357]
We introduce a class of sample-based explanations we term high-dimensional representers.
Our workhorse is a novel representer theorem for general regularized high-dimensional models.
We study the empirical performance of our proposed methods on three real-world binary classification datasets and two recommender system datasets.
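For orientation, the snippet below shows the classical representer decomposition for an l2-regularised linear model (ridge regression), writing a test prediction as a sum of per-training-sample contributions alpha_i * <x_i, x_test>. It is a minimal sketch of the general idea, not the paper's high-dimensional representer construction, and the data and regularisation strength are made up.

```python
import numpy as np

# Sketch: classical representer decomposition for ridge regression.
# Objective: (1/n) * sum_i (w.x_i - y_i)^2 + lam * ||w||^2.
# Stationarity gives  w* = sum_i alpha_i x_i  with  alpha_i = -(w*.x_i - y_i) / (lam * n),
# so any prediction decomposes into per-training-point contributions alpha_i * <x_i, x_test>.
rng = np.random.default_rng(0)

n, d, lam = 50, 5, 0.1
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

# closed-form ridge solution (note the n * lam scaling matching the objective above)
w = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

# representer weights: one scalar importance per training sample
alpha = -(X @ w - y) / (lam * n)

x_test = rng.normal(size=d)
contributions = alpha * (X @ x_test)          # contribution of each training point
print("direct prediction       :", float(x_test @ w))
print("sum of representer terms:", float(contributions.sum()))
```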
arXiv Detail & Related papers (2023-05-31T16:23:58Z)
- Landmark-free Statistical Shape Modeling via Neural Flow Deformations [0.5897108307012394]
We present FlowSSM, a novel shape modeling approach that learns shape variability without requiring dense correspondence between training instances.
Our model outperforms state-of-the-art methods in providing an expressive and robust shape prior for distal femur and liver.
arXiv Detail & Related papers (2022-09-14T18:17:19Z)
- Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to reconstructing thermodynamic functions and phase boundaries in two-parameter statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
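For context, plain (non-Bayesian) thermodynamic integration rests on the identity d(ln Z)/d(beta) = -<E>_beta, so the free energy can be rebuilt by integrating energy averages over inverse temperature. The sketch below checks this on a two-level system where everything is available in closed form; the paper's Bayesian reconstruction goes well beyond this, so treat the snippet only as the underlying identity.

```python
import numpy as np

# Thermodynamic integration on a two-level system with energies {0, eps}:
#   Z(beta) = 1 + exp(-beta * eps),   d(ln Z)/d(beta) = -<E>_beta.
# We integrate the (here exactly known) mean energy over beta and compare
# the reconstructed ln Z with its closed form.
eps = 1.0
betas = np.linspace(0.0, 5.0, 501)

def mean_energy(beta):
    p1 = np.exp(-beta * eps) / (1.0 + np.exp(-beta * eps))   # occupation of the excited state
    return eps * p1

# cumulative trapezoid integration of -<E> over beta, starting from ln Z(0) = ln 2
dlnZ = -mean_energy(betas)
lnZ_ti = np.log(2.0) + np.concatenate(
    [[0.0], np.cumsum(0.5 * (dlnZ[1:] + dlnZ[:-1]) * np.diff(betas))]
)
lnZ_exact = np.log(1.0 + np.exp(-betas * eps))

print("max reconstruction error:", float(np.max(np.abs(lnZ_ti - lnZ_exact))))
```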
arXiv Detail & Related papers (2022-05-18T17:11:23Z)
- CaDeX: Learning Canonical Deformation Coordinate Space for Dynamic Surface Representation via Neural Homeomorphism [46.234728261236015]
We introduce Canonical Deformation Coordinate Space (CaDeX), a unified representation of both shape and nonrigid motion.
Our novel deformation representation and its implementation are simple, efficient, and guarantee cycle consistency.
We demonstrate state-of-the-art performance in modelling a wide range of deformable objects.
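The cycle-consistency guarantee comes from mapping between observed and canonical coordinates with an invertible (homeomorphic) map, so going to canonical space and back returns the input by construction. The snippet below illustrates that property with a tiny additive coupling map standing in for the learned neural homeomorphism; the architecture and weights are placeholders, not CaDeX itself.

```python
import numpy as np

# Tiny invertible "coupling" map R^3 -> R^3 standing in for a learned homeomorphism:
# split coordinates into (a, b); a passes through, b is shifted by a function of a.
# Invertibility is exact, so mapping to canonical space and back is cycle-consistent.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 1))        # placeholder "network" weights
bias = rng.normal(size=2)

def to_canonical(p):
    a, b = p[..., :1], p[..., 1:]
    return np.concatenate([a, b + np.tanh(a @ W.T + bias)], axis=-1)

def from_canonical(q):
    a, b = q[..., :1], q[..., 1:]
    return np.concatenate([a, b - np.tanh(a @ W.T + bias)], axis=-1)

points = rng.normal(size=(1000, 3))             # surface samples at some frame
roundtrip = from_canonical(to_canonical(points))
print("max cycle-consistency error:", float(np.abs(roundtrip - points).max()))
```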
arXiv Detail & Related papers (2022-03-30T17:59:23Z)
- Dynamic multi feature-class Gaussian process models [0.0]
This study presents a statistical modelling method for automatic learning of shape, pose and intensity features in medical images.
A dynamic multi feature-class Gaussian process model (DMFC-GPM) is a GP-based model with a shared latent space that encodes linear and non-linear variation.
The performance results suggest that this new modelling paradigm is robust, accurate, and accessible, with potential for practical application.
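As a generic illustration of a GP-based shape prior (not the DMFC-GPM itself, which couples shape, pose and intensity through a shared latent space), the sketch below draws smooth deformation fields for a set of template landmarks from a zero-mean Gaussian process with an RBF kernel; the kernel and its scales are assumptions.

```python
import numpy as np

# Sketch: sample smooth 1D deformation fields for template landmark positions
# from a zero-mean GP prior with an RBF kernel (a generic GP shape-prior idea,
# not the DMFC-GPM's shared latent space over shape, pose and intensity).
rng = np.random.default_rng(0)

template = np.linspace(0.0, 1.0, 50)                     # landmark coordinates of a template
lengthscale, variance = 0.2, 0.05 ** 2

diff = template[:, None] - template[None, :]
K = variance * np.exp(-0.5 * (diff / lengthscale) ** 2)  # RBF covariance between landmarks
L = np.linalg.cholesky(K + 1e-10 * np.eye(len(template)))

n_samples = 5
deformations = L @ rng.normal(size=(len(template), n_samples))
shapes = template[:, None] + deformations                # each column is a sampled shape
print("sampled shapes:", shapes.shape)
```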
arXiv Detail & Related papers (2021-12-08T15:12:47Z)
- Functional additive regression on shape and form manifolds of planar curves [0.0]
We define shape and form as equivalence classes under translation and rotation and, for shapes, also scale.
We extend generalized additive regression to models for the shape/form of planar curves or landmark configurations.
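Working with such equivalence classes in practice usually means quotienting out the nuisance transformations explicitly. The sketch below does this for planar landmark configurations with ordinary Procrustes alignment, removing translation and rotation, and optionally scale (giving a shape rather than a form representative). This is a standard preprocessing step, not the paper's additive regression machinery.

```python
import numpy as np

# Sketch: ordinary Procrustes alignment of one planar landmark configuration
# onto another, removing translation, rotation and (optionally) scale.
def align(config, target, remove_scale=True):
    X = config - config.mean(axis=0)          # remove translation
    Y = target - target.mean(axis=0)
    U, _, Vt = np.linalg.svd(X.T @ Y)
    R = U @ Vt                                # optimal orthogonal map (may reflect)
    if np.linalg.det(R) < 0:                  # force a proper rotation
        U[:, -1] *= -1
        R = U @ Vt
    X = X @ R
    if remove_scale:                          # shapes: also quotient out scale
        X = X * (np.linalg.norm(Y) / np.linalg.norm(X))
    return X

rng = np.random.default_rng(0)
target = rng.normal(size=(20, 2))                         # reference landmark configuration
angle = 0.7
rot = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])
moved = 2.0 * target @ rot.T + np.array([3.0, -1.0])      # rotated, scaled, translated copy
aligned = align(moved, target)
print("residual after alignment:",
      float(np.abs(aligned - (target - target.mean(axis=0))).max()))
```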
arXiv Detail & Related papers (2021-09-06T17:43:32Z)
- Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce the Equivariant Stein Variational Gradient Descent algorithm, an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We propose new ways of improving and scaling up training of energy based models.
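For reference, the plain (non-equivariant) SVGD update moves a set of particles by phi(x) = (1/n) * sum_j [ k(x_j, x) * grad log p(x_j) + grad_{x_j} k(x_j, x) ]. The sketch below applies it to a 2D Gaussian target with a fixed-bandwidth RBF kernel; the equivariant kernel construction from the paper is not modelled here, and the target and hyperparameters are illustrative.

```python
import numpy as np

# Sketch: vanilla Stein Variational Gradient Descent on a 2D Gaussian target
# N(mu, I) with an RBF kernel (the paper's equivariant kernel is not modelled here).
rng = np.random.default_rng(0)
mu = np.array([2.0, -1.0])

def grad_log_p(x):                      # score of the target N(mu, I)
    return -(x - mu)

def svgd_step(x, step=0.1, h=0.5):
    diff = x[:, None, :] - x[None, :, :]            # (n, n, d) pairwise differences
    sq = np.sum(diff ** 2, axis=-1)
    k = np.exp(-sq / (2 * h))                        # RBF kernel matrix (n, n)
    grad_k = -diff / h * k[:, :, None]               # entry [j, i] = grad of k(x_j, x_i) w.r.t. x_j
    phi = (k @ grad_log_p(x) + grad_k.sum(axis=0)) / len(x)
    return x + step * phi

particles = rng.normal(size=(100, 2))                # initial particles from N(0, I)
for _ in range(500):
    particles = svgd_step(particles)
print("particle mean:", particles.mean(axis=0), "target mean:", mu)
```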
arXiv Detail & Related papers (2021-06-15T01:35:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.