Learning Manifold Implicitly via Explicit Heat-Kernel Learning
- URL: http://arxiv.org/abs/2010.01761v3
- Date: Mon, 15 Mar 2021 00:12:39 GMT
- Title: Learning Manifold Implicitly via Explicit Heat-Kernel Learning
- Authors: Yufan Zhou, Changyou Chen, Jinhui Xu
- Abstract summary: We propose the concept of implicit manifold learning, where manifold information is implicitly obtained by learning the associated heat kernel.
The learned heat kernel can be applied to various kernel-based machine learning models, including deep generative models (DGM) for data generation and Stein Variational Gradient Descent for Bayesian inference.
- Score: 63.354671267760516
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Manifold learning is a fundamental problem in machine learning with numerous
applications. Most of the existing methods directly learn the low-dimensional
embedding of the data in some high-dimensional space, and usually lack the
flexibility to be applied directly to downstream applications. In this
paper, we propose the concept of implicit manifold learning, where manifold
information is implicitly obtained by learning the associated heat kernel. A
heat kernel is the solution of the corresponding heat equation, which describes
how "heat" transfers on the manifold, thus containing ample geometric
information of the manifold. We provide both a practical algorithm and a
theoretical analysis of our framework. The learned heat kernel can be applied
to various kernel-based machine learning models, including deep generative
models (DGM) for data generation and Stein Variational Gradient Descent for
Bayesian inference. Extensive experiments show that our framework can achieve
state-of-the-art results compared to existing methods for the two tasks.
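To make the central object concrete: on a finite graph, the heat kernel can be written in closed form as the matrix exponential of the graph Laplacian. The sketch below is purely illustrative and is not the authors' method, which *learns* the heat kernel of an unknown manifold with a deep model; the 5-node path graph and diffusion time are arbitrary choices.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative only: the closed-form heat kernel on a small graph.
# The paper learns this object for an unknown manifold; here we just
# compute it exactly to show what "heat kernel" means.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0   # path graph: 0-1-2-3-4
L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian

t = 0.5
K = expm(-t * L)                      # heat kernel K_t = exp(-t L)

# K[i, j] = amount of "heat" at node j after time t, starting at node i.
print(K.round(3))
```

Note how the kernel encodes geometry: heat starting at node 0 concentrates on nearby nodes, so `K[0, 1] > K[0, 4]`, and each row sums to 1 because total heat is conserved.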
Related papers
- Practical Aspects on Solving Differential Equations Using Deep Learning: A Primer [0.0]
This primer aims to provide technical and practical insights into the Deep Galerkin method.
We demonstrate how to solve the one-dimensional heat equation step-by-step.
We also show how to apply the Deep Galerkin method to solve systems of ordinary differential equations and integral equations.
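For orientation, the target problem of that primer, the one-dimensional heat equation u_t = u_xx, can also be solved with a classical explicit finite-difference scheme. The sketch below is the standard FTCS scheme, not the Deep Galerkin method the primer teaches; grid sizes and the final time are arbitrary choices, and the initial condition sin(πx) is chosen because its exact decaying solution is known.

```python
import numpy as np

# Explicit (FTCS) finite-difference baseline for u_t = u_xx on [0, 1]
# with u(0, t) = u(1, t) = 0 and u(x, 0) = sin(pi * x).
# NOT the Deep Galerkin method; shown only to make the problem concrete.
nx = 51
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2                      # within the stability limit dt <= dx^2 / 2
x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)

t_final, t = 0.1, 0.0
while t < t_final:
    u[1:-1] += dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    t += dt

# Closed-form solution for this initial condition: the sine mode decays
# exponentially at rate pi^2.
exact = np.exp(-np.pi**2 * t) * np.sin(np.pi * x)
print(np.abs(u - exact).max())
```

The Deep Galerkin method instead minimizes the squared PDE residual of a neural-network ansatz at sampled points, which avoids the grid entirely.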
arXiv Detail & Related papers (2024-08-21T01:34:20Z)
- Sketching the Heat Kernel: Using Gaussian Processes to Embed Data [4.220336689294244]
We introduce a novel, non-deterministic method for embedding data in low-dimensional Euclidean space based on realizations of a Gaussian process depending on the geometry of the data.
A further advantage of our method is its robustness to outliers.
arXiv Detail & Related papers (2024-03-01T22:56:19Z)
- Unsupervised Discovery of Interpretable Directions in h-space of Pre-trained Diffusion Models [63.1637853118899]
We propose the first unsupervised and learning-based method to identify interpretable directions in h-space of pre-trained diffusion models.
We employ a shift control module that works on h-space of pre-trained diffusion models to manipulate a sample into a shifted version of itself.
By jointly optimizing them, the model will spontaneously discover disentangled and interpretable directions.
arXiv Detail & Related papers (2023-10-15T18:44:30Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z)
- A Heat Diffusion Perspective on Geodesic Preserving Dimensionality Reduction [66.21060114843202]
We propose a more general heat kernel based manifold embedding method that we call heat geodesic embeddings.
Results show that our method outperforms the existing state of the art in preserving ground-truth manifold distances.
We also showcase our method on single cell RNA-sequencing datasets with both continuum and cluster structure.
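Heat-kernel embeddings of this kind are closely related to diffusion maps: build a Gaussian ("heat") affinity kernel on the data, normalize it into a diffusion operator, and embed with its leading non-trivial eigenvectors. The sketch below is a generic diffusion-map-style embedding, not the heat-geodesic method of the paper; the sample (a noisy circle), bandwidth `eps`, and sample size are all arbitrary illustrative choices.

```python
import numpy as np

# Minimal diffusion-map-style embedding, in the spirit of heat-kernel
# dimensionality reduction. Data: 200 noisy points on a circle.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(200, 2))

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared dists
eps = 0.1
K = np.exp(-d2 / eps)                 # Gaussian heat-like affinity kernel
P = K / K.sum(axis=1, keepdims=True)  # row-stochastic diffusion operator

# Eigendecompose a symmetric matrix similar to P (standard conjugation
# trick: S = D^{-1/2} K D^{-1/2} shares P's eigenvalues).
Dm = 1.0 / np.sqrt(K.sum(axis=1))
S = Dm[:, None] * K * Dm[None, :]
vals, vecs = np.linalg.eigh(S)        # eigenvalues in ascending order

# Skip the trivial top eigenvector (eigenvalue 1); use the next two
# as 2-D embedding coordinates of the data.
embedding = (Dm[:, None] * vecs)[:, -3:-1]
print(embedding.shape)
```

Because the diffusion operator is row-stochastic, its largest eigenvalue is exactly 1 with a trivial eigenvector, which is why the embedding starts from the second eigenvector.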
arXiv Detail & Related papers (2023-05-30T13:58:50Z)
- The Manifold Hypothesis for Gradient-Based Explanations [55.01671263121624]
We study when gradient-based explanation algorithms provide perceptually-aligned explanations.
We show that the more a feature attribution is aligned with the tangent space of the data, the more perceptually-aligned it tends to be.
We suggest that explanation algorithms should actively strive to align their explanations with the data manifold.
arXiv Detail & Related papers (2022-06-15T08:49:24Z)
- A manifold learning approach for gesture identification from micro-Doppler radar measurements [1.4610038284393163]
We present a kernel-based approximation for manifold learning that requires no knowledge about the manifold except its dimension.
We demonstrate the performance of our approach using a publicly available micro-Doppler data set.
arXiv Detail & Related papers (2021-10-04T19:08:44Z)
- Unsupervised Dense Shape Correspondence using Heat Kernels [50.682560435495034]
We propose an unsupervised method for learning dense correspondences between shapes using a recent deep functional map framework.
Instead of depending on ground-truth correspondences or the computationally expensive geodesic distances, we use heat kernels.
We present results of our method on different benchmarks that pose various challenges, such as partiality, topological noise, and varying connectivity.
arXiv Detail & Related papers (2020-10-23T21:54:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.