HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric
Regularization
- URL: http://arxiv.org/abs/2204.08176v1
- Date: Mon, 18 Apr 2022 06:11:44 GMT
- Title: HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric
Regularization
- Authors: Menglin Yang, Min Zhou, Jiahong Liu, Defu Lian, Irwin King
- Abstract summary: We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometry-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also brings the models a better discriminative ability.
- Score: 52.369435664689995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In large-scale recommender systems, the user-item networks are generally
scale-free or expand exponentially. The latent features (also known as
embeddings) used to describe the user and item are determined by how well the
embedding space fits the data distribution. Hyperbolic space offers a spacious
room to learn embeddings with its negative curvature and metric properties,
which can well fit data with tree-like structures. Recently, several hyperbolic
approaches have been proposed to learn high-quality representations for the
users and items. However, most of them concentrate on developing hyperbolic
similarity by designing appropriate projection operations, whereas many
advantageous and exciting geometric properties of hyperbolic space have not
been explicitly explored. For example, one of the most notable properties of
hyperbolic space is that its capacity space increases exponentially with the
radius, which indicates the area far away from the hyperbolic origin is much
more embeddable. Regarding the geometric properties of hyperbolic space, we
bring up a \textit{Hyperbolic Regularization powered Collaborative Filtering}
(HRCF) and design a geometry-aware hyperbolic regularizer. Specifically, the
proposal boosts the optimization procedure via root alignment and an
origin-aware penalty, which is simple yet impressively effective. Through theoretical
analysis, we further show that our proposal is able to tackle the
over-smoothing problem caused by hyperbolic aggregation and also brings the
models a better discriminative ability. We conduct extensive empirical
analysis, comparing our proposal against a large set of baselines on several
public benchmarks. The empirical results show that our approach achieves highly
competitive performance and surpasses both the leading Euclidean and hyperbolic
baselines by considerable margins. Further analysis verifies ...
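The origin-aware penalty can be pictured concretely: since hyperbolic capacity grows exponentially with the radius, a regularizer can push embeddings away from the origin toward the more embeddable region. The sketch below is a minimal illustration in the Poincaré ball, not the authors' exact HRCF formulation (which also involves root alignment); the function names and the margin value are assumptions.

```python
import numpy as np

def poincare_dist_to_origin(x, eps=1e-9):
    """Hyperbolic distance from the origin in the Poincare ball
    (curvature -1): d(0, x) = 2 * artanh(||x||)."""
    norm = np.clip(np.linalg.norm(x, axis=-1), 0.0, 1.0 - eps)
    return 2.0 * np.arctanh(norm)

def origin_aware_penalty(embeddings, margin=1.0):
    """Hinge-style penalty on embeddings that sit too close to the origin,
    nudging them toward the high-capacity region far from the origin.
    Illustrative only; not the exact HRCF regularizer."""
    d = poincare_dist_to_origin(embeddings)
    return np.mean(np.maximum(0.0, margin - d))
```

Embeddings near the boundary (large hyperbolic norm) incur no penalty, while embeddings clustered at the origin are penalized in proportion to how far they fall short of the margin.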
Related papers
- Hyperbolic Fine-tuning for Large Language Models [56.54715487997674]
This study investigates the non-Euclidean characteristics of large language models (LLMs)
We show that token embeddings exhibit a high degree of hyperbolicity, indicating a latent tree-like structure in the embedding space.
We introduce a new method called hyperbolic low-rank efficient fine-tuning, HypLoRA, that performs low-rank adaptation directly on the hyperbolic manifold.
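Adaptation on a hyperbolic manifold is typically built from the exponential and logarithmic maps at the origin, which carry points between the Poincaré ball and its tangent space. The sketch below shows only this generic machinery with a hypothetical low-rank correction applied in the tangent space; it is not HypLoRA's actual formulation, and all function names are illustrative.

```python
import numpy as np

def exp0(v, eps=1e-12):
    """Exponential map at the origin of the Poincare ball (curvature -1):
    tangent vector -> point on the manifold."""
    n = np.linalg.norm(v, axis=-1, keepdims=True)
    return np.tanh(n) * v / np.maximum(n, eps)

def log0(x, eps=1e-12):
    """Logarithmic map at the origin (inverse of exp0):
    point on the manifold -> tangent vector."""
    n = np.linalg.norm(x, axis=-1, keepdims=True)
    return np.arctanh(np.clip(n, 0.0, 1.0 - 1e-9)) * x / np.maximum(n, eps)

def hyperbolic_lowrank_update(h, A, B):
    """Hypothetical low-rank correction: map to the tangent space,
    add the rank-limited term v @ A @ B, and map back to the ball."""
    v = log0(h)
    return exp0(v + v @ A @ B)
```

With `A` and `B` set to zero the update is the identity, mirroring the usual zero-initialized low-rank adaptation setup.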
arXiv Detail & Related papers (2024-10-05T02:58:25Z)
- Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- HICF: Hyperbolic Informative Collaborative Filtering [35.26872278129825]
Hyperbolic space is well-suited to describe the power-law distributed user-item network.
It is unclear which kinds of items can be effectively recommended by the hyperbolic model and which cannot.
We propose a novel learning method, named hyperbolic informative collaborative filtering (HICF), to compensate for the recommendation effectiveness of the head item.
arXiv Detail & Related papers (2022-07-19T03:45:38Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincaré ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincaré second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- Where are we in embedding spaces? A Comprehensive Analysis on Network Embedding Approaches for Recommender Systems [30.32394422015953]
This paper provides theoretical analysis and empirical results on when and where to use hyperbolic space and hyperbolic embeddings in recommender systems.
We evaluate our answers by comparing the performance of Euclidean space and hyperbolic space on different latent space models.
We propose a new metric learning based recommendation method called SCML and its hyperbolic version HSCML.
arXiv Detail & Related papers (2021-05-19T03:46:41Z)
- Hyperbolic Manifold Regression [33.40757136529844]
We consider the problem of performing manifold-valued regression onto a hyperbolic space as an intermediate component for a number of relevant machine learning applications.
We propose a novel perspective on two challenging tasks: 1) hierarchical classification via label embeddings and 2) taxonomy extension of hyperbolic representations.
Our experiments show that the strategy of leveraging the hyperbolic geometry is promising.
arXiv Detail & Related papers (2020-05-28T10:16:30Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.