Pseudo-Riemannian Embedding Models for Multi-Relational Graph
Representations
- URL: http://arxiv.org/abs/2212.03720v1
- Date: Fri, 2 Dec 2022 20:37:30 GMT
- Title: Pseudo-Riemannian Embedding Models for Multi-Relational Graph
Representations
- Authors: Saee Paliwal, Angus Brayne, Benedek Fabian, Maciej Wiatrak, Aaron Sim
- Abstract summary: We generalize single-relation pseudo-Riemannian graph embedding models to multi-relational networks.
We demonstrate their use in both knowledge graph completion and knowledge discovery in a biological domain.
- Score: 4.199844472131922
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we generalize single-relation pseudo-Riemannian graph embedding
models to multi-relational networks, and show that the typical approach of
encoding relations as manifold transformations translates from the Riemannian
to the pseudo-Riemannian case. In addition we construct a view of relations as
separate spacetime submanifolds of multi-time manifolds, and consider an
interpolation between a pseudo-Riemannian embedding model and its Wick-rotated
Riemannian counterpart. We validate these extensions in the task of link
prediction, focusing on flat Lorentzian manifolds, and demonstrate their use in
both knowledge graph completion and knowledge discovery in a biological domain.
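The flat Lorentzian setting described in the abstract can be illustrated with a small sketch. The code below is a minimal illustration under our own assumptions, not the paper's implementation: the function names and the TransE-style translation score are hypothetical. It computes a flat pseudo-Riemannian squared interval with a configurable number of time dimensions, covering both the single-time Lorentzian case and the multi-time manifolds mentioned above, and a toy link-prediction score that encodes a relation as a translation on the manifold.

```python
import numpy as np

def pseudo_riemannian_form(x, y, n_time=1):
    """Flat pseudo-Riemannian inner product with signature
    (-, ..., -, +, ..., +): the first n_time coordinates are time-like.
    n_time=1 gives the Lorentzian (Minkowski) case; n_time > 1 gives a
    flat multi-time manifold."""
    signs = np.ones(x.shape[-1])
    signs[:n_time] = -1.0
    return np.sum(signs * x * y, axis=-1)

def squared_interval(x, y, n_time=1):
    """Squared interval between two points of a flat pseudo-Riemannian
    manifold: negative for time-like separation, positive for space-like."""
    d = x - y
    return pseudo_riemannian_form(d, d, n_time=n_time)

def translation_score(head, relation, tail, n_time=1):
    """Hypothetical TransE-style scoring function (a sketch, not the
    paper's method): encode the relation as a translation of the head
    embedding, then penalize the magnitude of the squared interval to
    the tail embedding."""
    return -np.abs(squared_interval(head + relation, tail, n_time=n_time))
```

For example, with `n_time=1` the points `[1, 0]` and `[0, 0]` are time-like separated with squared interval -1, whereas `[0, 2]` and `[0, 0]` are space-like separated with squared interval +4. A Wick rotation (multiplying the time coordinates by the imaginary unit) flips the sign of the time-like contributions, recovering the ordinary Euclidean squared distance, which is the Riemannian counterpart the abstract interpolates toward.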
Related papers
- Sigma Flows for Image and Data Labeling and Learning Structured Prediction [2.4699742392289]
This paper introduces the sigma flow model for the prediction of structured labelings of data observed on a Riemannian manifold.
The approach combines the Laplace-Beltrami framework for image denoising and enhancement, introduced by Sochen, Kimmel and Malladi about 25 years ago, and the assignment flow approach introduced and studied by the authors.
arXiv Detail & Related papers (2024-08-28T17:04:56Z) - Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides in a manifold, and the covariable, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Riemannian Diffusion Models [11.306081315276089]
Diffusion models are recent state-of-the-art methods for image generation and likelihood estimation.
In this work, we generalize continuous-time diffusion models to arbitrary Riemannian manifolds.
Our proposed method achieves new state-of-the-art likelihoods on all benchmarks.
arXiv Detail & Related papers (2022-08-16T21:18:31Z) - The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) demonstrate remarkable empirical performance in image generation and likelihood estimation.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - A singular Riemannian geometry approach to Deep Neural Networks I.
Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z) - Semi-Riemannian Graph Convolutional Networks [36.09315878397234]
We develop a principled Semi-Riemannian GCN that first models data in a semi-Riemannian manifold of constant nonzero curvature.
Our method provides a geometric inductive bias that is sufficiently flexible to model mixed heterogeneous topologies like hierarchical graphs with cycles.
arXiv Detail & Related papers (2021-06-06T14:23:34Z) - A cortical-inspired sub-Riemannian model for Poggendorff-type visual
illusions [1.0499611180329804]
We consider Wilson-Cowan-type models for the description of orientation-dependent Poggendorff-like illusions.
Our numerical results show that the use of the sub-Riemannian kernel allows us to numerically reproduce visual misperceptions and inpainting-type biases.
arXiv Detail & Related papers (2020-12-28T11:00:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.