Probabilistic Learning Vector Quantization on Manifold of Symmetric
Positive Definite Matrices
- URL: http://arxiv.org/abs/2102.00667v1
- Date: Mon, 1 Feb 2021 06:58:39 GMT
- Title: Probabilistic Learning Vector Quantization on Manifold of Symmetric
Positive Definite Matrices
- Authors: Fengzhen Tang, Haifeng Feng, Peter Tino, Bailu Si, Daxiong Ji
- Abstract summary: We develop a new classification method for manifold-valued data in the framework of probabilistic learning vector quantization.
In this paper, we generalize the probabilistic learning vector quantization algorithm for data points living on the manifold of symmetric positive definite matrices.
Empirical investigations on synthetic data, image data, and motor imagery EEG data demonstrate the superior performance of the proposed method.
- Score: 3.727361969017079
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this paper, we develop a new classification method for manifold-valued
data in the framework of probabilistic learning vector quantization. In many
classification scenarios, the data can be naturally represented by symmetric
positive definite matrices, which are inherently points that live on a curved
Riemannian manifold. Due to the non-Euclidean geometry of Riemannian manifolds,
traditional Euclidean machine learning algorithms yield poor results on such
data. In this paper, we generalize the probabilistic learning vector
quantization algorithm for data points living on the manifold of symmetric
positive definite matrices equipped with the natural Riemannian metric (the
affine-invariant metric). By exploiting the induced Riemannian distance, we
derive the probabilistic learning Riemannian space quantization algorithm,
obtaining the learning rule through Riemannian gradient descent. Empirical
investigations on synthetic data, image data, and motor imagery EEG data
demonstrate the superior performance of the proposed method.
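The abstract describes two ingredients: the affine-invariant Riemannian distance between symmetric positive definite (SPD) matrices, and prototype learning via Riemannian gradient descent, i.e., updates that move along geodesics of the manifold. Below is a minimal, illustrative sketch (not the authors' reference implementation) of that machinery; the helper names, the fixed learning rate, the simple attract/repel rule, and the toy data are assumptions for illustration only. In the paper, the step size and direction would come from the gradient of the probabilistic LVQ cost rather than a fixed rule.

```python
# Sketch of the affine-invariant geometry on SPD matrices and a geodesic
# prototype update. Assumed helper names; not the paper's reference code.
import numpy as np

def _sym_fun(M, fun):
    """Apply a scalar function to the eigenvalues of a symmetric matrix."""
    w, V = np.linalg.eigh(M)
    return (V * fun(w)) @ V.T

def spd_log(W, X):
    """Riemannian logarithm Log_W(X) under the affine-invariant metric."""
    W_h = _sym_fun(W, np.sqrt)
    W_hi = _sym_fun(W, lambda w: 1.0 / np.sqrt(w))
    return W_h @ _sym_fun(W_hi @ X @ W_hi, np.log) @ W_h

def spd_exp(W, V):
    """Riemannian exponential Exp_W(V): step from W along tangent vector V."""
    W_h = _sym_fun(W, np.sqrt)
    W_hi = _sym_fun(W, lambda w: 1.0 / np.sqrt(w))
    return W_h @ _sym_fun(W_hi @ V @ W_hi, np.exp) @ W_h

def spd_dist2(W, X):
    """Squared affine-invariant distance ||log(W^{-1/2} X W^{-1/2})||_F^2."""
    W_hi = _sym_fun(W, lambda w: 1.0 / np.sqrt(w))
    return np.sum(np.log(np.linalg.eigvalsh(W_hi @ X @ W_hi)) ** 2)

def prototype_step(W, X, attract=True, lr=0.1):
    """One geodesic step: pull prototype W toward sample X (or push it away)."""
    sign = 1.0 if attract else -1.0
    return spd_exp(W, sign * lr * spd_log(W, X))

# Toy usage: one prototype, one SPD sample.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
X = A @ A.T + 3.0 * np.eye(3)   # SPD sample
W = np.eye(3)                   # initial prototype
print("squared distance before:", spd_dist2(W, X))
W = prototype_step(W, X, attract=True)
print("squared distance after :", spd_dist2(W, X))
```

Because the update is taken through the exponential map, the prototype stays on the SPD manifold after every step, which is the point of replacing Euclidean gradient descent with its Riemannian counterpart.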
Related papers
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
- Intrinsic Gaussian Process on Unknown Manifolds with Probabilistic Metrics [5.582101184758529]
This article presents a novel approach to constructing Intrinsic Gaussian Processes for regression on unknown manifolds with probabilistic metrics in point clouds.
The geometry of a manifold is in general different from the usual Euclidean geometry.
The applications of GPUM are illustrated in the simulation studies on the Swiss roll, high dimensional real datasets of WiFi signals and image data examples.
arXiv Detail & Related papers (2023-01-16T17:42:40Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Geometric Learning of Hidden Markov Models via a Method of Moments Algorithm [11.338112397748619]
We present a novel algorithm for learning the parameters of hidden Markov models (HMMs) in a geometric setting.
We demonstrate through examples that our learner achieves significantly improved speed and numerical accuracy compared to existing learners.
arXiv Detail & Related papers (2022-07-02T12:24:38Z)
- Riemannian Metric Learning via Optimal Transport [34.557360177483595]
We introduce an optimal transport-based model for learning a metric from cross-sectional samples of evolving probability measures.
We show that metrics learned using our method improve the quality of trajectory inference on scRNA and bird migration data.
arXiv Detail & Related papers (2022-05-18T23:32:20Z)
- Robust Geometric Metric Learning [17.855338784378]
This paper proposes new algorithms for the metric learning problem.
A general approach, called Robust Geometric Metric Learning (RGML), is then studied.
The performance of RGML is assessed on real datasets.
arXiv Detail & Related papers (2022-02-23T14:55:08Z)
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds [71.94111815357064]
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions.
One popular tool for finding such low-rank approximations is Riemannian optimization.
arXiv Detail & Related papers (2021-03-27T19:56:00Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data.
However, the associated geometric operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
- Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.