Vector-valued Distance and Gyrocalculus on the Space of Symmetric
Positive Definite Matrices
- URL: http://arxiv.org/abs/2110.13475v1
- Date: Tue, 26 Oct 2021 08:17:51 GMT
- Title: Vector-valued Distance and Gyrocalculus on the Space of Symmetric
Positive Definite Matrices
- Authors: Federico López, Beatrice Pozzetti, Steve Trettel, Michael Strube,
Anna Wienhard
- Abstract summary: We use the vector-valued distance to compute distances and extract geometric information from the manifold of symmetric positive definite matrices.
We develop gyrovector calculus, constructing analogs of vector space operations in this curved space.
- Score: 7.752212921476838
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We propose the use of the vector-valued distance to compute distances and
extract geometric information from the manifold of symmetric positive definite
matrices (SPD), and develop gyrovector calculus, constructing analogs of vector
space operations in this curved space. We implement these operations and
showcase their versatility in the tasks of knowledge graph completion, item
recommendation, and question answering. In experiments, the SPD models
outperform their equivalents in Euclidean and hyperbolic space. The
vector-valued distance allows us to visualize embeddings, showing that the
models learn to disentangle representations of positive samples from negative
ones.
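As a concrete illustration of the vector-valued distance described above (a minimal NumPy sketch, not the authors' implementation): for SPD matrices P and Q, the vector-valued distance is the sorted vector of log-eigenvalues of P^{-1/2} Q P^{-1/2}, and its Euclidean norm recovers the affine-invariant Riemannian distance on the SPD manifold.

```python
import numpy as np

def spd_inv_sqrt(P):
    """Inverse square root of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return (V / np.sqrt(w)) @ V.T

def vector_valued_distance(P, Q):
    """Sorted log-eigenvalues of P^{-1/2} Q P^{-1/2}.
    Its Euclidean norm equals the affine-invariant Riemannian distance."""
    R = spd_inv_sqrt(P)
    M = R @ Q @ R
    return np.sort(np.log(np.linalg.eigvalsh(M)))[::-1]
```

For example, with P the identity and Q = diag(e, 1/e), the vector-valued distance is (1, -1) and the scalar Riemannian distance is its norm, sqrt(2).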
Related papers
- An Intrinsic Vector Heat Network [64.55434397799728]
This paper introduces a novel neural network architecture for learning tangent vector fields embedded in 3D.
We introduce a trainable vector heat diffusion module to spatially propagate vector-valued feature data across the surface.
We also demonstrate the effectiveness of our method on the useful industrial application of quadrilateral mesh generation.
arXiv Detail & Related papers (2024-06-14T00:40:31Z) - Semisupervised regression in latent structure networks on unknown
manifolds [7.5722195869569]
We consider random dot product graphs, in which an edge is formed between two nodes with probability given by the inner product of their respective latent positions.
We propose a manifold learning and graph embedding technique to predict the response variable on out-of-sample nodes.
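The random dot product graph model described above can be sampled in a few lines (an illustrative sketch with hypothetical latent positions, not the paper's construction): each edge appears independently with probability equal to the inner product of the endpoints' latent positions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 2
# hypothetical latent positions chosen so all inner products lie in [0, 1]
X = rng.uniform(0.1, 0.6, size=(n, d))
P = X @ X.T                     # edge probability = inner product of latent positions
np.fill_diagonal(P, 0.0)        # no self-loops
U = np.triu(rng.uniform(size=(n, n)) < P, 1)
A = (U | U.T).astype(int)       # symmetric 0/1 adjacency matrix
```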
arXiv Detail & Related papers (2023-05-04T00:41:04Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG
Signals [24.798859309715667]
We propose a new method to deal with distributions of covariance matrices.
We show that it is an efficient surrogate to the Wasserstein distance in domain adaptation for Brain Computer Interface applications.
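The general sliced-Wasserstein idea (sketched here for equal-size point clouds in R^d, not the paper's SPD-matrix variant) replaces the full Wasserstein distance with an average of cheap one-dimensional Wasserstein distances over random projections:

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=50, seed=0):
    """Monte-Carlo sliced 2-Wasserstein distance between two equal-size
    point clouds; 1-D optimal transport reduces to sorting projections."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)       # random unit direction
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_proj)
```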
arXiv Detail & Related papers (2023-03-10T09:08:46Z) - Error-Covariance Analysis of Monocular Pose Estimation Using Total Least
Squares [5.710183643449906]
This study presents a theoretical structure for the monocular pose estimation problem using total least squares.
Observations of the features are extracted from the monocular camera images.
The attitude and position solutions are proven to reach the Cramér-Rao lower bound.
arXiv Detail & Related papers (2022-10-21T01:46:18Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Log-Euclidean Signatures for Intrinsic Distances Between Unaligned
Datasets [47.20862716252927]
We use manifold learning to compare the intrinsic geometric structures of different datasets.
We define a new theoretically-motivated distance based on a lower bound of the log-Euclidean metric.
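For reference, the standard log-Euclidean metric mentioned above compares SPD matrices through their matrix logarithms (a minimal sketch of the base metric, not the paper's lower-bound signature):

```python
import numpy as np

def spd_log(A):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    """Frobenius distance between matrix logs: the log-Euclidean metric."""
    return np.linalg.norm(spd_log(A) - spd_log(B), "fro")
```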
arXiv Detail & Related papers (2022-02-03T16:37:23Z) - Weighting vectors for machine learning: numerical harmonic analysis
applied to boundary detection [3.8848561367220276]
We show that when the metric space is Euclidean, the weighting vector serves as an effective tool for boundary detection.
We demonstrate performance that is competitive or exceeds performance of state-of-the-art techniques on benchmark data sets.
arXiv Detail & Related papers (2021-06-01T22:14:22Z) - Analysis of Truncated Orthogonal Iteration for Sparse Eigenvector
Problems [78.95866278697777]
We propose two variants of the Truncated Orthogonal Iteration to compute multiple leading eigenvectors with sparsity constraints simultaneously.
We then apply our algorithms to solve the sparse principal component analysis problem for a wide range of test datasets.
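A common form of truncated orthogonal iteration interleaves subspace iteration with per-column hard thresholding to enforce sparsity (a generic sketch with hypothetical parameter names; the paper's two variants may differ in their truncation rule):

```python
import numpy as np

def truncated_orthogonal_iteration(S, k, s, iters=200, seed=0):
    """Compute k leading eigenvectors of a symmetric matrix S, keeping
    only the s largest-magnitude entries in each column per iteration."""
    rng = np.random.default_rng(seed)
    n = S.shape[0]
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        Z = S @ Q
        for j in range(k):
            # zero out all but the s largest-magnitude entries of column j
            small = np.argsort(np.abs(Z[:, j]))[:-s]
            Z[small, j] = 0.0
        Q, _ = np.linalg.qr(Z)          # re-orthonormalize
    return Q
```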
arXiv Detail & Related papers (2021-03-24T23:11:32Z) - A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.