Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
- URL: http://arxiv.org/abs/2204.06645v1
- Date: Wed, 13 Apr 2022 21:43:28 GMT
- Title: Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
- Authors: Keaton Hamm, Nick Henscheid, Shujie Kang
- Abstract summary: We propose Wasserstein Isometric Mapping (Wassmap) as a parameter-free nonlinear dimensionality reduction technique.
Wassmap represents images via probability measures in Wasserstein space, then uses pairwise quadratic Wasserstein distances between the associated measures to produce a low-dimensional, approximately isometric embedding.
- Score: 0.7734726150561088
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a
parameter-free nonlinear dimensionality reduction technique that provides
solutions to some drawbacks in existing global nonlinear dimensionality
reduction algorithms in imaging applications. Wassmap represents images via
probability measures in Wasserstein space, then uses pairwise quadratic
Wasserstein distances between the associated measures to produce a
low-dimensional, approximately isometric embedding. We show that the algorithm
is able to exactly recover parameters of some image manifolds including those
generated by translations or dilations of a fixed generating measure.
Additionally, we show that a discrete version of the algorithm retrieves
parameters from manifolds generated from discrete measures by providing a
theoretical bridge to transfer recovery results from functional data to
discrete data. Testing of the proposed algorithms on various image data
manifolds shows that Wassmap yields good embeddings compared with other global
techniques.
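The abstract describes a concrete pipeline: treat each image as a discrete probability measure, compute all pairwise quadratic (2-)Wasserstein distances, and feed the squared distances to a metric embedding step. Below is a minimal sketch of that pipeline, assuming the POT (Python Optimal Transport) package and classical multidimensional scaling for the embedding step; the helper names (image_to_measure, w2_squared, classical_mds, wassmap) are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
import ot  # POT (Python Optimal Transport): pip install pot


def image_to_measure(img):
    """View a nonnegative grayscale image as a discrete probability measure:
    support = coordinates of its nonzero pixels, weights = normalized intensities."""
    h, w = img.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    support = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    weights = img.ravel().astype(float)
    mask = weights > 0                       # drop zero-mass pixels to shrink the OT problem
    return support[mask], weights[mask] / weights[mask].sum()


def w2_squared(meas_a, meas_b):
    """Squared quadratic (2-)Wasserstein distance via exact linear-program OT."""
    (xa, wa), (xb, wb) = meas_a, meas_b
    cost = ot.dist(xa, xb, metric="sqeuclidean")   # pairwise squared Euclidean ground cost
    return ot.emd2(wa, wb, cost)                   # optimal transport cost = W2^2


def classical_mds(sq_dists, dim):
    """Classical multidimensional scaling applied to a matrix of squared distances."""
    n = sq_dists.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ sq_dists @ J                    # double centering
    evals, evecs = np.linalg.eigh(B)
    idx = np.argsort(evals)[::-1][:dim]            # keep the top `dim` eigenpairs
    return evecs[:, idx] * np.sqrt(np.maximum(evals[idx], 0.0))


def wassmap(images, dim=2):
    """Embed a list of 2D images into `dim` dimensions via pairwise W2^2 + MDS."""
    measures = [image_to_measure(img) for img in images]
    n = len(measures)
    D2 = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D2[i, j] = D2[j, i] = w2_squared(measures[i], measures[j])
    return classical_mds(D2, dim)
```

As a usage example, applying wassmap to a set of translated copies of a fixed template image should, per the recovery results stated above, return a two-dimensional embedding matching the grid of translation parameters up to a rigid motion.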
Related papers
- A dimensionality reduction technique based on the Gromov-Wasserstein distance [7.8772082926712415]
We propose a new method for dimensionality reduction based on optimal transportation theory and the Gromov-Wasserstein distance.
Our method embeds high-dimensional data into a lower-dimensional space, providing a robust and efficient solution for analyzing complex high-dimensional datasets.
arXiv Detail & Related papers (2025-01-23T15:05:51Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Partial Symmetry Detection for 3D Geometry using Contrastive Learning with Geodesic Point Cloud Patches [10.48309709793733]
We propose to learn rotation, reflection, translation and scale invariant local shape features for geodesic point cloud patches.
We show that our approach is able to extract multiple valid solutions for this ambiguous problem.
We incorporate the detected symmetries together with a region growing algorithm to demonstrate a downstream task.
arXiv Detail & Related papers (2023-12-13T15:48:50Z)
- Linearized Wasserstein dimensionality reduction with approximation guarantees [65.16758672591365]
LOT Wassmap is a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space.
We show that LOT Wassmap attains correct embeddings and that the quality improves with increased sample size.
We also show that LOT Wassmap significantly reduces the computational cost compared with algorithms that depend on pairwise distance computations; a rough illustrative sketch of this linearized approach appears after this list.
arXiv Detail & Related papers (2023-02-14T22:12:16Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Incorporating Texture Information into Dimensionality Reduction for High-Dimensional Images [65.74185962364211]
We present a method for incorporating neighborhood information into distance-based dimensionality reduction methods.
Based on a classification of different methods for comparing image patches, we explore a number of different approaches.
arXiv Detail & Related papers (2022-02-18T13:17:43Z)
- Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ) in describing systems.
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z)
- Learned Block Iterative Shrinkage Thresholding Algorithm for Photothermal Super Resolution Imaging [52.42007686600479]
We propose a learned block-sparse optimization approach using an iterative algorithm unfolded into a deep neural network.
We show the benefits of using a learned block iterative shrinkage thresholding algorithm that is able to learn the choice of regularization parameters.
arXiv Detail & Related papers (2020-12-07T09:27:16Z)
- Pixel-Pair Occlusion Relationship Map (P2ORM): Formulation, Inference & Application [20.63938300312815]
We formalize concepts around geometric occlusion in 2D images (i.e., ignoring semantics).
We propose a novel unified formulation of both occlusion boundaries and occlusion orientations via a pixel-pair occlusion relation.
Experiments on a variety of datasets demonstrate that our method outperforms existing ones on this task.
We also propose a new depth map refinement method that consistently improves the performance of state-of-the-art monocular depth estimation methods.
arXiv Detail & Related papers (2020-07-23T15:52:09Z)
- Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
However, many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method for a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
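The "Linearized Wasserstein dimensionality reduction" entry above only states the computational benefit of LOT Wassmap. Purely for illustration, here is a rough sketch of the linearized-optimal-transport construction it refers to: fix one reference measure, compute a single transport map per sample (via barycentric projection of the optimal plan), and embed the resulting maps with PCA, which is equivalent to classical MDS on their Euclidean distances. The function name lot_wassmap and these implementation details are assumptions based on the general LOT literature, not the authors' code; with n samples this needs n transport solves rather than the n(n-1)/2 pairwise solves of the sketch given after the abstract.

```python
import numpy as np
import ot  # POT (Python Optimal Transport): pip install pot


def lot_wassmap(ref_support, ref_weights, measures, dim=2):
    """measures: list of (support, weights) pairs; ref_weights must be positive.
    Returns a `dim`-dimensional embedding built from linearized OT coordinates."""
    feats = []
    for support, weights in measures:
        cost = ot.dist(ref_support, support, metric="sqeuclidean")
        plan = ot.emd(ref_weights, weights, cost)   # one exact OT solve per sample
        # Barycentric projection: approximate transport map evaluated on the
        # reference support.
        T = (plan @ support) / ref_weights[:, None]
        # Scale by sqrt(reference mass) so Euclidean distances between these
        # vectors equal L2(reference) distances between the maps.
        feats.append((np.sqrt(ref_weights)[:, None] * T).ravel())
    X = np.vstack(feats)
    X -= X.mean(axis=0)                             # center the LOT coordinates
    # PCA of the centered coordinates = classical MDS on their Euclidean distances.
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim] * S[:dim]
```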
This list is automatically generated from the titles and abstracts of the papers in this site.