Toward a Geometric Theory of Manifold Untangling
- URL: http://arxiv.org/abs/2303.04203v1
- Date: Tue, 7 Mar 2023 19:47:01 GMT
- Title: Toward a Geometric Theory of Manifold Untangling
- Authors: Xin Li and Shuo Wang
- Abstract summary: We conjecture that there is a more general solution to manifold untangling in the topological space without artificially defining any distance metric.
General strategies of both global manifold embedding and local manifold flattening are presented and connected with existing work on the untangling of image, audio, and language data.
- Score: 12.229735866373073
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: It has been hypothesized that the ventral stream processing for object
recognition is based on a mechanism called cortically local subspace
untangling. A mathematical abstraction of object recognition by the visual
cortex is how to untangle the manifolds associated with different object
categories. Such a manifold untangling problem is closely related to the
celebrated kernel trick in metric space. In this paper, we conjecture that
there is a more general solution to manifold untangling in the topological
space without artificially defining any distance metric. Geometrically, we can
either $embed$ a manifold in a higher dimensional space to promote selectivity
or $flatten$ a manifold to promote tolerance. General strategies of both global
manifold embedding and local manifold flattening are presented and connected
with existing work on the untangling of image, audio, and language data. We
also discuss the implications of manifold untangling for motor control and
internal representations.
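The embedding-versus-flattening dichotomy can be illustrated with the kernel trick the abstract refers to: lifting points into a higher-dimensional space can make tangled classes linearly separable, i.e. promote selectivity. A minimal sketch under illustrative assumptions (the XOR-style data and the centered product feature are our own choices, not from the paper):

```python
import numpy as np

# Two classes tangled in 2D: an XOR pattern that no line can separate.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([1, 1, -1, -1])

def embed(X):
    # Lift to 3D by appending a degree-2 monomial of the centered
    # coordinates -- one coordinate of a polynomial kernel embedding.
    z = (X[:, 0] - 0.5) * (X[:, 1] - 0.5)
    return np.column_stack([X, z])

Z = embed(X)
# In the lifted space the hyperplane z3 = 0 separates the classes:
# the sign of the third coordinate now matches the label exactly.
print(np.sign(Z[:, 2]) == y)
```

The same data that are tangled in the ambient 2D space become linearly separable after the embedding, which is the "embed to promote selectivity" direction of the dichotomy.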
Related papers
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Turing approximations, toric isometric embeddings & manifold convolutions [0.0]
We define a convolution operator for a manifold of arbitrary topology and dimension.
A result of Alan Turing from 1938 underscores the need for such a toric isometric embedding approach to achieve a global definition of convolution.
arXiv Detail & Related papers (2021-10-05T18:36:16Z) - Statistical Mechanics of Neural Processing of Object Manifolds [3.4809730725241605]
This thesis lays the groundwork for a computational theory of neuronal processing of objects.
We identify that the capacity of a manifold is determined by its effective radius, R_M, and effective dimension, D_M.
arXiv Detail & Related papers (2021-06-01T20:49:14Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - Quadric hypersurface intersection for manifold learning in feature space [52.83976795260532]
A manifold learning technique suitable for moderately high-dimensional, large datasets is presented.
The manifold is learned from the training data as an intersection of quadric hypersurfaces.
At test time, this manifold can be used to introduce an outlier score for arbitrary new points.
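The outlier score above can be sketched in the simplest possible setting: a single quadric fitted by least squares, with the absolute residual of the quadric equation as the score. The circle dataset and function names below are illustrative assumptions, not the paper's method (which intersects several quadrics):

```python
import numpy as np

def quadric_features(X):
    # Monomials up to degree 2 in 2D: [1, x, y, x^2, x*y, y^2].
    x, y = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x, y, x**2, x * y, y**2])

def fit_quadric(X):
    # Coefficients q with q . phi(x) ~ 0 on the data: the right singular
    # vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(quadric_features(X))
    return Vt[-1]

def outlier_score(q, X):
    # Absolute residual of the quadric equation at each query point.
    return np.abs(quadric_features(X) @ q)

# Training data on the unit circle, itself a quadric: x^2 + y^2 - 1 = 0.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
train = np.column_stack([np.cos(t), np.sin(t)])
q = fit_quadric(train)

# A point on the manifold scores near zero; a far-away point scores high.
on_m = outlier_score(q, np.array([[np.cos(0.3), np.sin(0.3)]]))[0]
off_m = outlier_score(q, np.array([[2.0, 2.0]]))[0]
```

Because the fitted coefficient vector has unit norm, the residual is a comparable score across query points, which is what makes it usable as an outlier detector at test time.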
arXiv Detail & Related papers (2021-02-11T18:52:08Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
However, many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method built on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z) - Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentangling is possible to the question of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z) - Sample complexity and effective dimension for regression on manifolds [13.774258153124205]
We consider the theory of regression on a manifold using reproducing kernel Hilbert space methods.
We show that certain spaces of smooth functions on a manifold are effectively finite-dimensional, with a complexity that scales according to the manifold dimension.
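The claim that complexity scales with the manifold dimension rather than the ambient dimension can be probed numerically via the effective dimension of the kernel matrix, a standard complexity measure in kernel ridge regression. A hedged sketch (the circle-in-R^20 construction, the Gaussian kernel, and the regularization value are our own illustrative choices):

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def effective_dimension(K, lam):
    # N(lam) = trace(K (K + lam I)^{-1}), computed from the eigenvalues.
    evals = np.linalg.eigvalsh(K)
    return float((evals / (evals + lam)).sum())

rng = np.random.default_rng(0)
n = 200
# A one-dimensional manifold (a circle) isometrically embedded in R^20.
t = rng.uniform(0, 2 * np.pi, n)
circle = np.column_stack([np.cos(t), np.sin(t)])
Q, _ = np.linalg.qr(rng.normal(size=(20, 2)))
X_manifold = circle @ Q.T             # lies on a 1-D curve in R^20
X_ambient = rng.normal(size=(n, 20))  # genuinely 20-dimensional data

lam = 1e-3
K_m = gaussian_kernel(X_manifold)
K_a = gaussian_kernel(X_ambient)
# The effective dimension tracks the manifold, not the ambient space.
print(effective_dimension(K_m, lam), effective_dimension(K_a, lam))
```

For data confined to the curve, the kernel eigenvalues decay rapidly and the effective dimension stays small, whereas the genuinely 20-dimensional sample yields an effective dimension close to the sample size; this is the "effectively finite-dimensional" behavior the summary describes.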
arXiv Detail & Related papers (2020-06-13T14:09:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.