Enhance Hyperbolic Representation Learning via Second-order Pooling
- URL: http://arxiv.org/abs/2410.22026v1
- Date: Tue, 29 Oct 2024 13:17:43 GMT
- Title: Enhance Hyperbolic Representation Learning via Second-order Pooling
- Authors: Kun Song, Ruben Solozabal, Li Hao, Lu Ren, Moloud Abdar, Qing Li, Fakhri Karray, Martin Takac
- Abstract summary: We introduce second-order pooling into hyperbolic representation learning.
It naturally increases the distance between samples without compromising the generalization ability of the input features.
We propose a kernel approximation regularization, which enables low-dimensional bilinear features to approximate the kernel function well.
- Score: 8.798965454017988
- License:
- Abstract: Hyperbolic representation learning is well known for its ability to capture hierarchical information. However, the distance between samples from different levels of the class hierarchy may be required to be large. We reveal that the hyperbolic discriminant objective forces the backbone to capture this hierarchical information, which may inevitably increase the Lipschitz constant of the backbone. This can hinder the full utilization of the backbone's generalization ability. To address this issue, we introduce second-order pooling into hyperbolic representation learning, as it naturally increases the distance between samples without compromising the generalization ability of the input features. In this way, the Lipschitz constant of the backbone does not need to be large. However, current off-the-shelf low-dimensional bilinear pooling methods cannot be directly employed in hyperbolic representation learning because they inevitably reduce the distance expansion capability. To solve this problem, we propose a kernel approximation regularization, which enables the low-dimensional bilinear features to approximate the kernel function well. Finally, we conduct extensive experiments on graph-structured datasets to demonstrate the effectiveness of the proposed method.
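As a rough sketch of these two ingredients (not the authors' implementation; the projection P, the degree-2 polynomial kernel target, and all names here are illustrative assumptions), the following numpy snippet pairs second-order pooling, whose vectorized outer products satisfy <vec(xx^T), vec(yy^T)> = (x^T y)^2 exactly, with low-dimensional bilinear features and a Gram-matrix matching regularizer:

```python
import numpy as np

def second_order_pooling(X):
    """Full second-order (bilinear) pooling: vec(x x^T) per sample.
    Note <vec(x x^T), vec(y y^T)> = (x^T y)^2, so the Gram matrix of
    these features equals a degree-2 polynomial kernel exactly."""
    return np.einsum("ni,nj->nij", X, X).reshape(len(X), -1)

def low_dim_bilinear(X, P):
    """Low-dimensional bilinear features: project vec(x x^T) to k dims.
    P is a stand-in for whatever learned compression is used."""
    return second_order_pooling(X) @ P  # (n, d*d) @ (d*d, k) -> (n, k)

def kernel_approx_reg(Z, X):
    """Kernel approximation regularizer (illustrative): push the Gram
    matrix of the compressed features toward the polynomial kernel that
    full second-order pooling realizes."""
    K_target = (X @ X.T) ** 2   # (x_i^T x_j)^2
    K_approx = Z @ Z.T
    return np.mean((K_approx - K_target) ** 2)

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 16))              # backbone output features
P = rng.standard_normal((16 * 16, 64)) / 16.0  # learned jointly in practice
Z = low_dim_bilinear(X, P)
print("regularizer value:", kernel_approx_reg(Z, X))
```

In training, P would be learned jointly with the backbone so that the low-dimensional features retain the kernel's distance-expansion behaviour.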
Related papers
- Accelerating hyperbolic t-SNE [7.411478341945197]
This paper introduces the first acceleration structure for hyperbolic embeddings, building upon a polar quadtree.
We demonstrate that it computes embeddings of similar quality in significantly less time.
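As a minimal sketch of the underlying data structure (names, capacity threshold, and midpoint splitting rule are assumptions, and the Barnes-Hut-style traversal that actually accelerates the t-SNE gradients is omitted), a polar quadtree partitions the Poincaré disk by radius bands and angle sectors:

```python
import math
import random

class PolarQuadtreeNode:
    """One cell of a polar quadtree over the Poincare disk: a radius band
    [r_lo, r_hi) crossed with an angle sector [t_lo, t_hi)."""

    def __init__(self, r_lo, r_hi, t_lo, t_hi, capacity=4):
        self.bounds = (r_lo, r_hi, t_lo, t_hi)
        self.capacity = capacity
        self.points = []      # (r, theta) pairs held by leaf cells
        self.children = None  # four sub-cells once the leaf overflows

    def insert(self, r, theta):
        if self.children is not None:
            self._child_for(r, theta).insert(r, theta)
            return
        self.points.append((r, theta))
        if len(self.points) > self.capacity:
            self._split()

    def _split(self):
        # Split at the midpoints of the radius band and the angle sector.
        r_lo, r_hi, t_lo, t_hi = self.bounds
        r_mid, t_mid = (r_lo + r_hi) / 2, (t_lo + t_hi) / 2
        self.children = [
            PolarQuadtreeNode(a, b, c, d, self.capacity)
            for a, b in ((r_lo, r_mid), (r_mid, r_hi))
            for c, d in ((t_lo, t_mid), (t_mid, t_hi))
        ]
        points, self.points = self.points, []
        for r, theta in points:
            self._child_for(r, theta).insert(r, theta)

    def _child_for(self, r, theta):
        r_lo, r_hi, t_lo, t_hi = self.bounds
        idx = 2 * (r >= (r_lo + r_hi) / 2) + (theta >= (t_lo + t_hi) / 2)
        return self.children[idx]

random.seed(0)
root = PolarQuadtreeNode(0.0, 1.0, 0.0, 2 * math.pi)
for _ in range(100):
    root.insert(random.random(), random.uniform(0.0, 2 * math.pi))
```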
arXiv Detail & Related papers (2024-01-23T12:59:40Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information.
We show that in hyperbolic space one has to address the leaf- and height-level uniformity, which are related to properties of trees.
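A minimal numpy sketch of what an alignment objective in hyperbolic space can look like, assuming the standard Poincaré-ball distance and squared-distance alignment between two augmented views; this is illustrative, not the paper's exact metric:

```python
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    """Geodesic distance on the Poincare ball (curvature -1)."""
    uu = np.sum(u * u, axis=-1)
    vv = np.sum(v * v, axis=-1)
    duv = np.sum((u - v) ** 2, axis=-1)
    arg = 1.0 + 2.0 * duv / ((1.0 - uu) * (1.0 - vv) + eps)
    return np.arccosh(np.maximum(arg, 1.0))

def alignment_loss(z1, z2):
    """Alignment: two views of the same graph should be geodesically close."""
    return float(np.mean(poincare_dist(z1, z2) ** 2))

rng = np.random.default_rng(0)
z1 = rng.uniform(-0.3, 0.3, size=(8, 2))          # view-1 embeddings
z2 = z1 + rng.normal(scale=0.01, size=z1.shape)   # perturbed view 2
print("alignment:", alignment_loss(z1, z2))
```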
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- HMSN: Hyperbolic Self-Supervised Learning by Clustering with Ideal Prototypes [7.665392786787577]
We use hyperbolic representation space for self-supervised representation learning in prototype-based clustering approaches.
We extend Masked Siamese Networks to operate on the Poincaré ball model of hyperbolic space.
Unlike previous methods, we project to the hyperbolic space at the output of the encoder network and utilise a hyperbolic projection head to ensure that the representations used for downstream tasks remain hyperbolic.
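A minimal sketch of the standard projection step, assuming curvature -c and the exponential map at the origin of the Poincaré ball; the hyperbolic projection head itself is omitted:

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-9):
    """Exponential map at the origin of the Poincare ball of curvature -c:
    sends a Euclidean (tangent) vector strictly inside the unit ball."""
    sqrt_c = np.sqrt(c)
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

feats = np.random.default_rng(0).standard_normal((4, 8))  # encoder outputs
ball = expmap0(feats)
print(np.linalg.norm(ball, axis=-1))  # every norm is < 1
```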
arXiv Detail & Related papers (2023-05-18T12:38:40Z)
- FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph Completion [45.470475498688344]
Some important operations in hyperbolic space still lack good definitions, making existing methods unable to fully leverage the merits of hyperbolic space.
We develop a Fully and Flexible Hyperbolic Representation framework (FFHR) that is able to transfer recent Euclidean-based advances to hyperbolic space.
arXiv Detail & Related papers (2023-02-07T14:50:28Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce a Hyperbolic Regularization powered Collaborative Filtering (HRCF) method and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also gives the model better discriminative ability.
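One plausible reading of the origin-aware penalty, sketched below for illustration only (HRCF's exact form may differ): discourage embeddings from collapsing toward the origin, where hyperbolic distances degenerate and aggregation over-smooths:

```python
import numpy as np

def origin_aware_penalty(z, eps=1e-9):
    """Illustrative origin-aware regularizer (not HRCF's exact form):
    reward embeddings for staying away from the origin of the Poincare
    ball, where points are hardest to tell apart."""
    norms = np.clip(np.linalg.norm(z, axis=-1), 0.0, 1.0 - eps)
    d_origin = 2.0 * np.arctanh(norms)  # hyperbolic distance to the origin
    return float(-np.mean(d_origin))    # minimizing pushes points outward

z = np.random.default_rng(0).uniform(-0.2, 0.2, size=(16, 4))
print("penalty:", origin_aware_penalty(z))
```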
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- On the Benefits of Large Learning Rates for Kernel Methods [110.03020563291788]
We show that the benefits of large learning rates can be precisely characterized in the context of kernel methods.
We consider the minimization of a quadratic objective in a separable Hilbert space, and show that with early stopping, the choice of learning rate influences the spectral decomposition of the obtained solution.
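A small worked example of the effect (not from the paper): running gradient descent on a diagonal quadratic with a fixed early-stopping budget shows that the learning rate controls how much of each eigendirection is fitted, and hence the spectral profile of the returned solution:

```python
import numpy as np

# Quadratic objective 0.5 * sum_i sigma_i * (w_i - w*_i)^2 with a spread
# of Hessian eigenvalues; compare per-eigendirection progress after the
# same early-stopping budget under two learning rates.
sigma = np.array([1.0, 0.1, 0.01])  # Hessian eigenvalues
w_star = np.ones(3)                 # minimizer

def fitted_fraction(lr, steps=50):
    w = np.zeros(3)
    for _ in range(steps):
        w -= lr * sigma * (w - w_star)  # exact gradient of the quadratic
    return w / w_star                   # 1.0 means fully fitted

for lr in (0.1, 1.0):
    print(f"lr={lr}: {np.round(fitted_fraction(lr), 3)}")
# The larger learning rate fits the small-eigenvalue directions far more
# within the same budget, changing the spectral makeup of the solution.
```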
arXiv Detail & Related papers (2022-02-28T13:01:04Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
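For context, a sketch of the hyperboloid-model linear decision rule that such large-margin formulations typically build on; the Minkowski inner product and arcsinh margin surrogate are standard, but the paper's adversarial-example injection is omitted and the details here are assumptions:

```python
import numpy as np

def minkowski_ip(u, v):
    """Minkowski inner product <u, v> = -u0*v0 + sum_i ui*vi, the bilinear
    form underlying the hyperboloid model."""
    return -u[..., 0] * v[..., 0] + np.sum(u[..., 1:] * v[..., 1:], axis=-1)

def lift(x):
    """Lift Euclidean coordinates onto the hyperboloid x0 = sqrt(1+||x||^2)."""
    x0 = np.sqrt(1.0 + np.sum(x * x, axis=-1, keepdims=True))
    return np.concatenate([x0, x], axis=-1)

def decision(w, x):
    """Hyperbolic linear classifier: label = sign of the Minkowski inner
    product; arcsinh of it serves as a signed margin surrogate."""
    s = minkowski_ip(w, lift(x))
    return np.sign(s), np.arcsinh(s)

w = np.array([0.0, 1.0, -0.5])  # hyperplane parameters, <w, w>_M > 0
labels, margins = decision(w, np.array([[0.3, 0.1], [-0.4, 0.2]]))
print(labels, margins)
```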
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
- Differentiating through the Fréchet Mean [51.32291896926807]
The Fréchet mean is a generalization of the Euclidean mean.
We show how to differentiate through the Fréchet mean for an arbitrary Riemannian manifold.
This fully integrates the Fr'echet mean into the hyperbolic neural network pipeline.
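A plain fixed-point (Karcher-flow) sketch for computing the Fréchet mean on the Poincaré ball, using the standard Möbius addition and log/exp maps; the paper's contribution, differentiating through this computation, is not addressed here:

```python
import numpy as np

def mobius_add(u, v):
    """Mobius addition on the Poincare ball (curvature -1)."""
    uv = np.sum(u * v, axis=-1, keepdims=True)
    uu = np.sum(u * u, axis=-1, keepdims=True)
    vv = np.sum(v * v, axis=-1, keepdims=True)
    return ((1 + 2 * uv + vv) * u + (1 - uu) * v) / (1 + 2 * uv + uu * vv)

def logmap(m, x, eps=1e-9):
    """Log map at m: pull ball points into the tangent space at m."""
    d = mobius_add(-m, x)
    dn = np.maximum(np.linalg.norm(d, axis=-1, keepdims=True), eps)
    lam = 2.0 / (1.0 - np.sum(m * m, axis=-1, keepdims=True))
    return (2.0 / lam) * np.arctanh(np.minimum(dn, 1 - eps)) * d / dn

def expmap(m, v, eps=1e-9):
    """Exp map at m: push a tangent vector back onto the ball."""
    vn = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    lam = 2.0 / (1.0 - np.sum(m * m, axis=-1, keepdims=True))
    return mobius_add(m, np.tanh(lam * vn / 2.0) * v / vn)

def frechet_mean(X, iters=20):
    """Karcher flow: repeatedly average in the tangent space at the
    current estimate, then map the average back onto the ball."""
    m = np.zeros(X.shape[-1])
    for _ in range(iters):
        m = expmap(m, np.mean(logmap(m, X), axis=0))
    return m

X = np.random.default_rng(0).uniform(-0.4, 0.4, size=(10, 2))
print("Frechet mean:", frechet_mean(X))
```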
arXiv Detail & Related papers (2020-02-29T19:49:38Z)
- Latent Variable Modelling with Hyperbolic Normalizing Flows [35.1659722563025]
We introduce a novel normalizing flow on hyperbolic space that improves over hyperbolic VAEs and Euclidean normalizing flows.
Our approach achieves improved performance on density estimation, as well as reconstruction of real-world graph data.
arXiv Detail & Related papers (2020-02-15T07:44:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.