Intrinsic Gaussian Process on Unknown Manifolds with Probabilistic
Metrics
- URL: http://arxiv.org/abs/2301.06533v1
- Date: Mon, 16 Jan 2023 17:42:40 GMT
- Title: Intrinsic Gaussian Process on Unknown Manifolds with Probabilistic
Metrics
- Authors: Mu Niu, Zhenwen Dai, Pokman Cheung, Yizhu Wang
- Abstract summary: This article presents a novel approach to construct Intrinsic Gaussian Processes for regression on unknown manifolds with probabilistic metrics in point clouds.
The geometry of a manifold is in general different from the usual Euclidean geometry.
The applications of GPUM are illustrated in the simulation studies on the Swiss roll, high dimensional real datasets of WiFi signals and image data examples.
- Score: 5.582101184758529
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This article presents a novel approach to construct Intrinsic Gaussian
Processes for regression on unknown manifolds with probabilistic metrics (GPUM)
in point clouds. In many real world applications, one often encounters high
dimensional data (e.g. point cloud data) centred around some lower dimensional
unknown manifolds. The geometry of a manifold is in general different from the
usual Euclidean geometry. Naively applying traditional smoothing methods such
as Euclidean Gaussian Processes (GPs) to manifold-valued data, thus ignoring
the geometry of the space, can potentially lead to highly misleading predictions
and inferences. A manifold embedded in a high dimensional Euclidean space can
be well described by a probabilistic mapping function and the corresponding
latent space. We investigate the geometrical structure of the unknown manifolds
using the Bayesian Gaussian Process latent variable model (BGPLVM) and
Riemannian geometry. The distribution of the metric tensor is learned using
BGPLVM. The boundary of the resulting manifold is defined based on the
uncertainty quantification of the mapping. We use the probabilistic metric
tensor to simulate Brownian Motion paths on the unknown manifold. The heat
kernel is estimated as the transition density of Brownian Motion and used as
the covariance function of GPUM. The applications of GPUM are illustrated in
the simulation studies on the Swiss roll, high dimensional real datasets of
WiFi signals and image data examples. Its performance is compared with the
Graph Laplacian GP, Graph Matérn GP and Euclidean GP.
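The covariance construction described above (a metric tensor on the latent space, Brownian motion simulated under it, the heat kernel estimated as a transition density) can be sketched as follows. This is a minimal illustration under strong simplifying assumptions (a fixed, user-supplied metric field rather than a BGPLVM posterior, and an Euler-Maruyama step that omits the drift correction term of manifold Brownian motion), not the paper's implementation.

```python
import numpy as np

def brownian_paths(x0, metric, n_paths=300, n_steps=80, dt=1e-3, seed=0):
    """Simulate Brownian motion in latent coordinates under a metric
    tensor field metric(x) -> (d, d) SPD array. Simplified
    Euler-Maruyama step dx = sqrt(dt) * G(x)^{-1/2} * eps; the drift
    correction term is omitted in this sketch."""
    rng = np.random.default_rng(seed)
    x0 = np.asarray(x0, float)
    d = x0.size
    X = np.tile(x0, (n_paths, 1))
    for _ in range(n_steps):
        for i in range(n_paths):
            w, V = np.linalg.eigh(metric(X[i]))
            G_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
            X[i] += np.sqrt(dt) * G_inv_sqrt @ rng.standard_normal(d)
    return X

def heat_kernel_estimate(x0, x1, metric, bandwidth=0.1, **kw):
    """Estimate k(x0, x1) as the transition density of Brownian motion
    started at x0, via a Gaussian kernel density estimate at x1."""
    ends = brownian_paths(x0, metric, **kw)
    d = len(x0)
    sq = np.sum((ends - np.asarray(x1, float)) ** 2, axis=1)
    norm = (2 * np.pi * bandwidth ** 2) ** (d / 2)
    return np.mean(np.exp(-sq / (2 * bandwidth ** 2))) / norm

# Sanity check with a flat (Euclidean) metric: the transition density
# should be larger near the starting point than far from it.
flat = lambda x: np.eye(2)
k_near = heat_kernel_estimate([0.0, 0.0], [0.0, 0.0], flat)
k_far = heat_kernel_estimate([0.0, 0.0], [1.0, 1.0], flat)
assert k_near > k_far
```

A Gram matrix would then be assembled by evaluating this estimate between all pairs of training points; the paper additionally uses the uncertainty of the BGPLVM mapping to define the manifold boundary that the paths respect.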
Related papers
- Gaussian Entanglement Measure: Applications to Multipartite Entanglement
of Graph States and Bosonic Field Theory [50.24983453990065]
An entanglement measure based on the Fubini-Study metric has been recently introduced by Cocchiarella and co-workers.
We present the Gaussian Entanglement Measure (GEM), a generalization of geometric entanglement measure for multimode Gaussian states.
By providing a computable multipartite entanglement measure for systems with a large number of degrees of freedom, we show that our definition can be used to obtain insights into a free bosonic field theory.
arXiv Detail & Related papers (2024-01-31T15:50:50Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, named soft manifolds, that can address this problem.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data, which in practice often lie on an implicit low-dimensional manifold.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
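For context, the plain Euclidean GP that such manifold-aware approaches compare against fits in a few lines. The RBF kernel and hyperparameter values below are illustrative choices, not taken from any of the papers listed here.

```python
import numpy as np

def euclidean_gp_predict(X, y, Xs, ell=0.2, sigma2=1.0, noise=1e-2):
    """Exact GP regression with an RBF (squared exponential) kernel.
    Returns the posterior mean and variance at test inputs Xs."""
    def k(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return sigma2 * np.exp(-0.5 * d2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))            # noisy Gram matrix
    Ks = k(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)
    cov = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
    var = np.clip(np.diag(cov), 0.0, None) + noise  # guard tiny negatives
    return mean, var

X = np.linspace(0, 1, 20)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
mu, var = euclidean_gp_predict(X, y, X)
assert np.max(np.abs(mu - y)) < 0.1   # near-interpolation of training data
assert np.all(var > 0)
```

On manifold-valued inputs this kernel measures straight-line Euclidean distance between points, which is exactly the failure mode the GPUM abstract above warns about.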
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Inferring Manifolds From Noisy Data Using Gaussian Processes [17.166283428199634]
Most existing manifold learning algorithms replace the original data with lower dimensional coordinates.
This article proposes a new methodology for addressing these problems, allowing interpolation of the estimated manifold between fitted data points.
arXiv Detail & Related papers (2021-10-14T15:50:38Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
- Probabilistic Learning Vector Quantization on Manifold of Symmetric Positive Definite Matrices [3.727361969017079]
We develop a new classification method for manifold-valued data in the framework of probabilistic learning vector quantization.
In this paper, we generalize the probabilistic learning vector quantization algorithm for data points living on the manifold of symmetric positive definite matrices.
Empirical investigations on synthetic data, image data, and motor imagery EEG data demonstrate the superior performance of the proposed method.
arXiv Detail & Related papers (2021-02-01T06:58:39Z)
- Graph Based Gaussian Processes on Restricted Domains [13.416168979487118]
In nonparametric regression, it is common for the inputs to fall in a restricted subset of Euclidean space.
We propose a new class of Graph Laplacian based GPs (GL-GPs) which learn a covariance that respects the geometry of the input domain.
We provide substantial theoretical support for the GL-GP methodology, and illustrate performance gains in various applications.
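The general idea behind Graph Laplacian based covariances can be sketched as follows: build a nearest-neighbour graph on the inputs, form the graph Laplacian, and take its matrix heat kernel as a covariance. This minimal version is a generic illustration, not the GL-GP construction itself.

```python
import numpy as np

def graph_heat_kernel(X, t=1.0, k=5):
    """Covariance from the heat kernel of a kNN-graph Laplacian:
    K = V exp(-t * eigvals(L)) V^T, with L = D - W. A generic sketch."""
    n = len(X)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]   # skip self at index 0
        W[i, nbrs] = np.exp(-d2[i, nbrs])
        W[nbrs, i] = W[i, nbrs]             # symmetrize the graph
    L = np.diag(W.sum(axis=1)) - W          # unnormalized Laplacian
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-t * w)) @ V.T

K = graph_heat_kernel(np.random.default_rng(1).normal(size=(30, 2)))
assert np.allclose(K, K.T)                   # symmetric
assert np.all(np.linalg.eigvalsh(K) > -1e-9) # positive semidefinite
```

Because the graph is built from distances between observed points only, the resulting covariance respects the geometry of the input domain rather than that of the ambient space.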
arXiv Detail & Related papers (2020-10-14T17:01:29Z)
- Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentangling is possible to that of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)
- Linear-time inference for Gaussian Processes on one dimension [17.77516394591124]
We investigate data sampled in one dimension, for which state-space models are popular due to their linearly scaling computational costs.
We provide the first general proof of the conjecture that state-space models are general, able to approximate any one-dimensional Gaussian Process.
We develop parallelized algorithms for performing inference and learning in the LEG model, test the algorithm on real and synthetic data, and demonstrate scaling to datasets with billions of samples.
arXiv Detail & Related papers (2020-03-11T23:20:13Z)
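The linear-time idea behind the entry above can be illustrated with the simplest state-space GP: a Matérn-1/2 (Ornstein-Uhlenbeck) process, whose posterior mean on sorted 1-D inputs is computed by a Kalman filter in O(n) rather than the O(n^3) of direct GP inference. This is a generic sketch of the state-space reduction, not the LEG model; hyperparameters are illustrative.

```python
import numpy as np

def ou_gp_filter(t, y, ell=1.0, sigma2=1.0, noise=0.1):
    """Kalman filter for a Matern-1/2 GP observed with Gaussian noise.
    The OU process has the exact discretization
        x_{k+1} = a_k x_k + q_k,  a_k = exp(-dt_k / ell),
        Var(q_k) = sigma2 * (1 - a_k^2),
    so filtering over n sorted inputs costs O(n)."""
    m, P = 0.0, sigma2                 # prior state mean and variance
    means = []
    prev = t[0]
    for tk, yk in zip(t, y):
        a = np.exp(-(tk - prev) / ell)           # transition over gap
        m, P = a * m, a * a * P + sigma2 * (1 - a * a)
        K = P / (P + noise)                      # Kalman gain
        m, P = m + K * (yk - m), (1 - K) * P     # measurement update
        means.append(m)
        prev = tk
    return np.array(means)

t = np.linspace(0, 4, 200)
y = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(200)
mu = ou_gp_filter(t, y)
# Filtering should denoise: closer to the clean signal than the raw data.
assert np.mean((mu - np.sin(t)) ** 2) < np.mean((y - np.sin(t)) ** 2)
```

A backward smoothing pass (Rauch-Tung-Striebel) would recover the full GP posterior mean at the same O(n) cost.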
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.