Differentiating through the Fréchet Mean
- URL: http://arxiv.org/abs/2003.00335v4
- Date: Mon, 5 Jul 2021 23:47:42 GMT
- Title: Differentiating through the Fréchet Mean
- Authors: Aaron Lou, Isay Katsman, Qingxuan Jiang, Serge Belongie, Ser-Nam Lim,
Christopher De Sa
- Abstract summary: The Fréchet mean is a generalization of the Euclidean mean.
We show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds.
This fully integrates the Fréchet mean into the hyperbolic neural network pipeline.
- Score: 51.32291896926807
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in deep representation learning on Riemannian manifolds
extend classical deep learning operations to better capture the geometry of the
manifold. One possible extension is the Fréchet mean, the generalization of
the Euclidean mean; however, it has been difficult to apply because it lacks a
closed form with an easily computable derivative. In this paper, we show how to
differentiate through the Fréchet mean for arbitrary Riemannian manifolds.
Then, focusing on hyperbolic space, we derive explicit gradient expressions and
a fast, accurate, and hyperparameter-free Fréchet mean solver. This fully
integrates the Fréchet mean into the hyperbolic neural network pipeline. To
demonstrate this integration, we present two case studies. First, we apply our
Fréchet mean to the existing Hyperbolic Graph Convolutional Network,
replacing its projected aggregation to obtain state-of-the-art results on
datasets with high hyperbolicity. Second, to demonstrate the Fréchet mean's
capacity to generalize Euclidean neural network operations, we develop a
hyperbolic batch normalization method that gives an improvement parallel to the
one observed in the Euclidean setting.
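To make the central operation concrete: the Fréchet mean is the point minimizing the sum of squared geodesic distances to a set of points. The sketch below (not the authors' solver; function names and the numerical gradient are ours, and a real implementation would use closed-form gradients) approximates it on the Poincaré ball by Riemannian gradient descent, rescaling the Euclidean gradient by the inverse squared conformal factor of the ball metric.

```python
import numpy as np

def poincare_dist(u, v):
    """Geodesic distance between two points in the open Poincare ball."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / denom)

def numerical_grad(f, x, eps=1e-6):
    """Central-difference gradient of a scalar function f at x (illustration only)."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

def frechet_mean(points, lr=0.05, steps=500):
    """Approximate argmin_mu sum_i d(mu, x_i)^2 by Riemannian gradient descent."""
    objective = lambda m: sum(poincare_dist(m, x) ** 2 for x in points)
    mu = np.mean(points, axis=0) * 0.5        # crude initialization inside the ball
    for _ in range(steps):
        grad = numerical_grad(objective, mu)
        lam = 2.0 / (1.0 - np.sum(mu ** 2))   # conformal factor at mu
        mu = mu - lr * grad / lam ** 2        # Riemannian gradient step
        n = np.linalg.norm(mu)
        if n >= 1.0:                          # retract into the open ball
            mu = mu / n * 0.999
    return mu
```

The fixed point of this iteration is exactly the Fréchet mean (where the gradient vanishes); what the paper contributes is a hyperparameter-free solver and the machinery to backpropagate through the result.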
Related papers
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Hyperbolic Representation Learning: Revisiting and Advancing [43.1661098138936]
We introduce a position-tracking mechanism to scrutinize existing prevalent hyperbolic learning models, revealing that the learned representations are sub-optimal and unsatisfactory.
We propose a simple yet effective method, hyperbolic informed embedding (HIE), that incorporates cost-free hierarchical information deduced from the hyperbolic distance of a node to the origin.
Our method achieves a remarkable improvement of up to 21.4% compared to the competing baselines.
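For intuition, the "hyperbolic distance of a node to the origin" used as a hierarchy signal above has a simple closed form on the Poincaré ball: d(0, x) = 2 artanh(‖x‖). A minimal sketch (the function name is ours, not from the paper):

```python
import numpy as np

def dist_to_origin(x):
    # Poincare-ball geodesic distance from the origin: d(0, x) = 2 * artanh(||x||).
    # Norms approaching 1 (the ball's boundary) map to unboundedly deep points,
    # which is why this quantity can encode a node's hierarchy level.
    return 2.0 * np.arctanh(np.linalg.norm(x))
```

Because the distance depends only on the embedding norm, it is available "cost-free" from the representation itself, with no extra supervision.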
arXiv Detail & Related papers (2023-06-15T13:25:39Z)
- FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph Completion [45.470475498688344]
Some important operations in hyperbolic space still lack good definitions, making existing methods unable to fully leverage the merits of hyperbolic space.
We develop a Fully and Flexible Hyperbolic Representation framework (FFHR) that is able to transfer recent Euclidean-based advances to hyperbolic space.
arXiv Detail & Related papers (2023-02-07T14:50:28Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometry-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Fully Hyperbolic Neural Networks [63.22521652077353]
We propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model.
We show that our method has better performance for building both shallow and deep networks.
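To make "the Lorentz model" concrete: points live on the upper sheet of the hyperboloid ⟨x, x⟩_L = −1 under the Minkowski inner product, and geodesic distance is d(x, y) = arccosh(−⟨x, y⟩_L). A minimal sketch (not the paper's code; names are ours):

```python
import numpy as np

def lorentz_inner(x, y):
    # Minkowski inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift(v):
    # Embed a Euclidean vector v as the "space" part of a point on the
    # hyperboloid <x, x>_L = -1, solving for the time coordinate x0.
    return np.concatenate(([np.sqrt(1.0 + np.dot(v, v))], v))

def lorentz_dist(x, y):
    # Geodesic distance; the clip guards against -<x, y>_L dipping
    # below 1 due to floating-point error.
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))
```

Working directly on the hyperboloid avoids the boundary-of-the-ball numerical issues of the Poincaré model, which is one motivation for fully Lorentzian networks.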
arXiv Detail & Related papers (2021-05-31T03:36:49Z)
- Hyperbolic Manifold Regression [33.40757136529844]
We consider the problem of performing manifold-valued regression onto a hyperbolic space as an intermediate component for a number of relevant machine learning applications.
We propose a novel perspective on two challenging tasks: 1) hierarchical classification via label embeddings and 2) taxonomy extension of hyperbolic representations.
Our experiments show that the strategy of leveraging the hyperbolic geometry is promising.
arXiv Detail & Related papers (2020-05-28T10:16:30Z)
- Latent Variable Modelling with Hyperbolic Normalizing Flows [35.1659722563025]
We introduce a novel normalizing flow that builds on hyperbolic VAEs and Euclidean normalizing flows.
Our approach achieves improved performance on density estimation, as well as reconstruction of real-world graph data.
arXiv Detail & Related papers (2020-02-15T07:44:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.