Deep Signatures -- Learning Invariants of Planar Curves
- URL: http://arxiv.org/abs/2202.05922v1
- Date: Fri, 11 Feb 2022 22:34:15 GMT
- Title: Deep Signatures -- Learning Invariants of Planar Curves
- Authors: Roy Velich, Ron Kimmel
- Abstract summary: We propose a learning paradigm for numerical approximation of differential invariants of planar curves.
Deep neural networks' (DNNs') universal approximation properties are utilized to estimate geometric measures.
- Score: 12.699486382844393
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a learning paradigm for numerical approximation of differential
invariants of planar curves. Deep neural networks' (DNNs') universal
approximation properties are utilized to estimate geometric measures. The
proposed framework is shown to be a preferable alternative to axiomatic
constructions. Specifically, we show that DNNs can learn to overcome
instabilities and sampling artifacts and produce numerically-stable signatures
for curves subject to a given group of transformations in the plane. We compare
the proposed schemes to alternative state-of-the-art axiomatic constructions of
group invariant arc-lengths and curvatures.
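As a point of reference for the axiomatic constructions the abstract compares against, the Euclidean curvature of a sampled planar curve can be estimated with finite differences. The following NumPy sketch is illustrative only (it is not the paper's code):

```python
import numpy as np

def discrete_curvature(points):
    """Finite-difference estimate of the Euclidean curvature of a sampled
    planar curve. points: (N, 2) array, assumed roughly uniformly sampled."""
    x, y = points[:, 0], points[:, 1]
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    # Signed curvature of a parametric curve: (x'y'' - y'x'') / |c'|^3
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Sanity check: a circle of radius r has constant curvature 1/r.
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
circle = np.stack([2.0 * np.cos(t), 2.0 * np.sin(t)], axis=1)
kappa = discrete_curvature(circle)  # interior values close to 1/2
```

Finite-difference schemes like this degrade under noise and non-uniform sampling, which is exactly the instability the paper's learned signatures aim to overcome.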
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
We introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations, a topological regularization loss encouraging clustering within classes.
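The basic relative-representation construction (cosine similarity against a fixed anchor set) can be sketched in a few lines of NumPy. The anchor choice and shapes below are illustrative assumptions, and this sketch shows only the baseline construction, not the paper's extended normalization:

```python
import numpy as np

def relative_representation(embeddings, anchors):
    """Represent each embedding by its cosine similarity to a fixed anchor
    set. embeddings: (N, D), anchors: (A, D) -> (N, A)."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return e @ a.T

rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 16))
anchors = emb[:3]  # hypothetical anchor choice, just for illustration

rel = relative_representation(emb, anchors)
# The cosine projection is already invariant to an isotropic rescaling of
# the latent space; the paper's normalization extends invariance to
# non-isotropic rescalings and permutations.
rel_scaled = relative_representation(3.0 * emb, 3.0 * anchors)
```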
arXiv Detail & Related papers (2024-09-17T08:09:22Z) - A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variants of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z) - Learning Differential Invariants of Planar Curves [12.699486382844393]
We propose a learning paradigm for the numerical approximation of differential invariants of planar curves.
Deep neural networks' (DNNs') universal approximation properties are utilized to estimate geometric measures.
arXiv Detail & Related papers (2023-03-06T19:30:43Z) - Geometric Scattering on Measure Spaces [12.0756034112778]
We introduce a general, unified model for geometric scattering on measure spaces.
We consider finite measure spaces that are obtained from randomly sampling an unknown manifold.
We propose two methods for constructing a data-driven graph on which the associated graph scattering transform approximates the scattering transform on the underlying manifold.
arXiv Detail & Related papers (2022-08-17T22:40:09Z) - Geometric variational inference [0.0]
Variational Inference (VI) or Markov-Chain Monte-Carlo (MCMC) techniques are used to go beyond point estimates.
This work proposes geometric Variational Inference (geoVI), a method based on Riemannian geometry and the Fisher information metric.
The distribution, expressed in the coordinate system induced by the transformation, takes a particularly simple form that allows for an accurate variational approximation.
arXiv Detail & Related papers (2021-05-21T17:18:50Z) - A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z) - Tractable structured natural gradient descent using local parameterizations [43.51581051770027]
Natural-gradient descent on structured parameter spaces is computationally challenging due to complicated inverse Fisher-matrix computations.
We address this issue by using local-parameter coordinates.
We show results on a range of applications in deep learning, variational inference, and evolution strategies.
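For intuition, here is a toy illustration of natural-gradient descent on a 1-D Gaussian, where the Fisher matrix is known in closed form. This is a minimal sketch of the general technique, not the paper's structured multivariate setting, which is precisely where the inverse-Fisher computation stops being trivial:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.5, size=10_000)

# Natural-gradient ascent on the average log-likelihood of a 1-D Gaussian,
# parameterized by (mu, s2). Here the Fisher matrix is known in closed form,
# F = diag(1/s2, 1/(2*s2**2)), so preconditioning by F^{-1} is cheap.
mu, s2, lr = 0.0, 1.0, 0.5
for _ in range(50):
    g_mu = (data.mean() - mu) / s2                              # d/d mu
    g_s2 = -0.5 / s2 + 0.5 * ((data - mu) ** 2).mean() / s2**2  # d/d s2
    mu += lr * s2 * g_mu            # F^{-1} rescales the mu-gradient by s2
    s2 += lr * 2.0 * s2**2 * g_s2   # ... and the s2-gradient by 2*s2^2
# mu and s2 converge to the sample mean and (biased) sample variance.
```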
arXiv Detail & Related papers (2021-02-15T09:09:20Z) - Efficient Semi-Implicit Variational Inference [65.07058307271329]
We propose an efficient and scalable method for semi-implicit variational inference (SIVI).
Our method yields a tractable treatment of SIVI's evidence lower bound and its gradients.
arXiv Detail & Related papers (2021-01-15T11:39:09Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
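The sampling-based ELBO estimation this entry contrasts with can be sketched for a toy fully factorized discrete model, where the exact ELBO also happens to have a closed form for comparison. The model and parameters below are illustrative assumptions, not the paper's circuit construction:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4
theta = rng.normal(size=D)         # unnormalized target: log p~(z) = theta @ z
q = rng.uniform(0.2, 0.8, size=D)  # mean-field Bernoulli variational dist.

def log_p_tilde(z):
    return z @ theta

def log_q(z):
    return (z * np.log(q) + (1 - z) * np.log(1 - q)).sum(axis=-1)

# Sampling-based ELBO estimate: E_q[log p~(z) - log q(z)].
z = (rng.uniform(size=(100_000, D)) < q).astype(float)
elbo_mc = float((log_p_tilde(z) - log_q(z)).mean())

# For this tiny factorized model the same ELBO can be computed in closed
# form, so the Monte Carlo estimate can be checked against it.
entropy = -(q * np.log(q) + (1 - q) * np.log(1 - q)).sum()
elbo_exact = float(q @ theta + entropy)
```

Sampling-based estimators like `elbo_mc` carry Monte Carlo variance; the analytic route is what the circuit-based approach extends to much richer discrete models.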
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs)
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.