Where are we in embedding spaces? A Comprehensive Analysis on Network
Embedding Approaches for Recommender Systems
- URL: http://arxiv.org/abs/2105.08908v1
- Date: Wed, 19 May 2021 03:46:41 GMT
- Title: Where are we in embedding spaces? A Comprehensive Analysis on Network
Embedding Approaches for Recommender Systems
- Authors: Sixiao Zhang, Hongxu Chen, Xiao Ming, Lizhen Cui, Hongzhi Yin,
Guandong Xu
- Abstract summary: This paper provides theoretical analysis and empirical results on when and where to use hyperbolic space and hyperbolic embeddings in recommender systems.
We evaluate our answers by comparing the performance of Euclidean space and hyperbolic space on different latent space models.
We propose a new metric learning based recommendation method called SCML and its hyperbolic version HSCML.
- Score: 30.32394422015953
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperbolic space and hyperbolic embeddings are becoming a popular research
field for recommender systems. However, it is not clear under what
circumstances hyperbolic space should be considered. To fill this gap, this
paper provides theoretical analysis and empirical results on when and where to
use hyperbolic space and hyperbolic embeddings in recommender systems.
Specifically, we answer which types of models and datasets are better suited
for hyperbolic space, as well as which latent size to choose.
We evaluate our answers by comparing the performance of Euclidean space and
hyperbolic space on different latent space models in both general item
recommendation domain and social recommendation domain, with 6 widely used
datasets and different latent sizes. Additionally, we propose a new metric
learning based recommendation method called SCML and its hyperbolic version
HSCML. We evaluate our conclusions regarding hyperbolic space on SCML and show
the state-of-the-art performance of hyperbolic space by comparing HSCML with
other baseline methods.
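For intuition about the hyperbolic embeddings compared throughout this work, the following is a minimal illustrative sketch of the Poincaré-ball geodesic distance, the metric such models typically optimize; the pure-Python implementation and function name are this summary's own, not code from the paper:

```python
import math

def poincare_distance(u, v):
    """Geodesic distance between two points inside the unit Poincare ball.

    d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_norm = lambda x: sum(c * c for c in x)
    diff = [a - b for a, b in zip(u, v)]
    numerator = 2.0 * sq_norm(diff)
    denominator = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + numerator / denominator)

# Distance from the origin grows only logarithmically in Euclidean terms,
# but explodes near the boundary -- which is why tree-like (hierarchical)
# user-item structures embed with low distortion in hyperbolic space.
print(poincare_distance([0.0, 0.0], [0.5, 0.0]))  # ≈ 1.0986 (= ln 3)
print(poincare_distance([0.9, 0.0], [0.0, 0.9]))  # two near-boundary points are very far apart
```

Points near the ball's boundary act like leaves of a tree: they sit close to the origin's "root" along radial geodesics yet far from each other, mirroring the hierarchical structure the paper argues some datasets exhibit.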
Related papers
- On Probabilistic Pullback Metrics on Latent Hyperbolic Manifolds [5.724027955589408]
This paper focuses on the hyperbolic manifold, a particularly suitable choice for modeling hierarchical relationships.
We propose augmenting the hyperbolic metric with a pullback metric to account for distortions introduced by the latent variable model's nonlinear mapping.
Through various experiments, we demonstrate that geodesics on the pullback metric not only respect the geometry of the hyperbolic latent space but also align with the underlying data distribution.
arXiv Detail & Related papers (2024-10-28T09:13:00Z)
- Hyperbolic Fine-tuning for Large Language Models [56.54715487997674]
This study investigates the non-Euclidean characteristics of large language models (LLMs).
We show that token embeddings exhibit a high degree of hyperbolicity, indicating a latent tree-like structure in the embedding space.
We introduce a new method called hyperbolic low-rank efficient fine-tuning, HypLoRA, that performs low-rank adaptation directly on the hyperbolic manifold.
arXiv Detail & Related papers (2024-10-05T02:58:25Z)
- Dynamic Hyperbolic Attention Network for Fine Hand-object Reconstruction [76.5549647815413]
We propose the first precise hand-object reconstruction method in hyperbolic space, namely the Dynamic Hyperbolic Attention Network (DHANet).
Our method learns mesh features with rich geometry-image multi-modal information and models better hand-object interaction.
arXiv Detail & Related papers (2023-09-06T13:00:10Z)
- Knowledge-based Multiple Adaptive Spaces Fusion for Recommendation [35.20583774988951]
We propose a knowledge-based multiple adaptive spaces fusion method for recommendation, namely MCKG.
Unlike existing methods that solely adopt a specific manifold, we introduce a unified space that is compatible with hyperbolic, Euclidean, and spherical spaces.
In addition, we propose a geometry-aware optimization strategy that enables the pull and push processes to benefit from both hyperbolic and spherical spaces.
arXiv Detail & Related papers (2023-08-29T12:11:16Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometry-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincaré ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincaré second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- Switch Spaces: Learning Product Spaces with Sparse Gating [48.591045282317424]
We propose Switch Spaces, a data-driven approach for learning representations in product space.
We introduce sparse gating mechanisms that learn to choose, combine and switch spaces.
Experiments on knowledge graph completion and item recommendations show that the proposed switch space achieves new state-of-the-art performances.
arXiv Detail & Related papers (2021-02-17T11:06:59Z)
- Aligning Hyperbolic Representations: an Optimal Transport-based approach [0.0]
This work proposes a novel approach based on OT of embeddings on the Poincaré model of hyperbolic spaces.
As a result of this formalism, we derive extensions to some existing Euclidean methods of OT-based domain adaptation to their hyperbolic counterparts.
arXiv Detail & Related papers (2020-12-02T11:22:19Z)
- Joint and Progressive Subspace Analysis (JPSA) with Spatial-Spectral Manifold Alignment for Semi-Supervised Hyperspectral Dimensionality Reduction [48.73525876467408]
We propose a novel technique for hyperspectral subspace analysis, called joint and progressive subspace analysis (JPSA).
Experiments are conducted to demonstrate the superiority and effectiveness of the proposed JPSA on two widely-used hyperspectral datasets.
arXiv Detail & Related papers (2020-09-21T16:29:59Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.