Where are we in embedding spaces? A Comprehensive Analysis on Network
Embedding Approaches for Recommender Systems
- URL: http://arxiv.org/abs/2105.08908v1
- Date: Wed, 19 May 2021 03:46:41 GMT
- Title: Where are we in embedding spaces? A Comprehensive Analysis on Network
Embedding Approaches for Recommender Systems
- Authors: Sixiao Zhang, Hongxu Chen, Xiao Ming, Lizhen Cui, Hongzhi Yin,
Guandong Xu
- Abstract summary: This paper provides theoretical analysis and empirical results on when and where to use hyperbolic space and hyperbolic embeddings in recommender systems.
We evaluate our answers by comparing the performance of Euclidean space and hyperbolic space on different latent space models.
We propose a new metric learning based recommendation method called SCML and its hyperbolic version HSCML.
- Score: 30.32394422015953
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperbolic space and hyperbolic embeddings are becoming a popular research
field for recommender systems. However, it is not clear under what
circumstances hyperbolic space should be considered. To fill this gap, this
paper provides theoretical analysis and empirical results on when and where to
use hyperbolic space and hyperbolic embeddings in recommender systems.
Specifically, we answer which types of models and datasets are better suited
for hyperbolic space, as well as which latent size to choose.
We evaluate our answers by comparing the performance of Euclidean space and
hyperbolic space on different latent space models in both general item
recommendation domain and social recommendation domain, with 6 widely used
datasets and different latent sizes. Additionally, we propose a new metric
learning based recommendation method called SCML and its hyperbolic version
HSCML. We evaluate our conclusions regarding hyperbolic space on SCML and show
the state-of-the-art performance of hyperbolic space by comparing HSCML with
other baseline methods.
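The abstract compares Euclidean against hyperbolic embeddings for recommendation. The implementations of SCML and HSCML are not given here, so the following is only an illustrative sketch of the standard Poincaré-ball geodesic distance that hyperbolic metric-learning recommenders of this kind typically score user-item pairs with:

```python
import math

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball.

    Illustrative sketch only -- the paper's SCML/HSCML models are not
    reproduced here; this is the textbook distance such methods build on.
    """
    nu = sum(x * x for x in u)          # ||u||^2
    nv = sum(x * x for x in v)          # ||v||^2
    duv = sum((a - b) ** 2 for a, b in zip(u, v))  # ||u - v||^2
    # d(u, v) = arccosh(1 + 2*||u-v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    arg = 1.0 + 2.0 * duv / max((1.0 - nu) * (1.0 - nv), eps)
    return math.acosh(arg)
```

Note that points near the ball's boundary are far apart even when their Euclidean gap is small, which is why tree-like user-item hierarchies tend to embed well in hyperbolic space.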
Related papers
- Modeling All Response Surfaces in One for Conditional Search Spaces [69.90317997694218]
This paper proposes a novel approach to model the response surfaces of all subspaces in one.
We introduce an attention-based deep feature extractor, capable of projecting configurations with different structures from various subspaces into a unified feature space.
arXiv Detail & Related papers (2025-01-08T03:56:06Z)
- Hyperbolic Fine-tuning for Large Language Models [56.54715487997674]
This study investigates the non-Euclidean characteristics of large language models (LLMs)
We show that token embeddings exhibit a high degree of hyperbolicity, indicating a latent tree-like structure in the embedding space.
We introduce a new method called hyperbolic low-rank efficient fine-tuning, HypLoRA, that performs low-rank adaptation directly on the hyperbolic manifold.
arXiv Detail & Related papers (2024-10-05T02:58:25Z)
- Knowledge-based Multiple Adaptive Spaces Fusion for Recommendation [35.20583774988951]
We propose a knowledge-based multiple adaptive spaces fusion method for recommendation, namely MCKG.
Unlike existing methods that solely adopt a specific manifold, we introduce a unified space that is compatible with hyperbolic, Euclidean, and spherical spaces.
In addition, we propose a geometry-aware optimization strategy that enables the pull and push processes to benefit from both hyperbolic and spherical spaces.
arXiv Detail & Related papers (2023-08-29T12:11:16Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce a Hyperbolic Regularization powered Collaborative Filtering (HRCF) method and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also brings the models a better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincaré ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincaré second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- Switch Spaces: Learning Product Spaces with Sparse Gating [48.591045282317424]
We propose Switch Spaces, a data-driven approach for learning representations in product space.
We introduce sparse gating mechanisms that learn to choose, combine and switch spaces.
Experiments on knowledge graph completion and item recommendations show that the proposed switch space achieves new state-of-the-art performances.
arXiv Detail & Related papers (2021-02-17T11:06:59Z)
- Aligning Hyperbolic Representations: an Optimal Transport-based approach [0.0]
This work proposes a novel approach based on OT of embeddings on the Poincaré model of hyperbolic spaces.
As a result of this formalism, we derive extensions to some existing Euclidean methods of OT-based domain adaptation to their hyperbolic counterparts.
arXiv Detail & Related papers (2020-12-02T11:22:19Z)
- Joint and Progressive Subspace Analysis (JPSA) with Spatial-Spectral Manifold Alignment for Semi-Supervised Hyperspectral Dimensionality Reduction [48.73525876467408]
We propose a novel technique for hyperspectral subspace analysis.
The technique is called joint and progressive subspace analysis (JPSA).
Experiments are conducted to demonstrate the superiority and effectiveness of the proposed JPSA on two widely-used hyperspectral datasets.
arXiv Detail & Related papers (2020-09-21T16:29:59Z)
- Hyperbolic Manifold Regression [33.40757136529844]
We consider the problem of performing manifold-valued regression onto a hyperbolic space as an intermediate component for a number of relevant machine learning applications.
We propose a novel perspective on two challenging tasks: 1) hierarchical classification via label embeddings and 2) taxonomy extension of hyperbolic representations.
Our experiments show that the strategy of leveraging the hyperbolic geometry is promising.
arXiv Detail & Related papers (2020-05-28T10:16:30Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
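Several of the papers above (e.g. the hyperbolic fine-tuning and tangent-space classifier works) move between Euclidean tangent vectors and points on the Poincaré ball via exponential and logarithmic maps. A minimal sketch of these maps at the origin, assuming curvature -1 (the exact formulations in each paper may differ):

```python
import math

def exp0(x):
    """Exponential map at the origin of the Poincare ball (curvature -1).

    Maps a Euclidean (tangent-space) vector onto the hyperbolic manifold.
    Illustrative only; not taken from any specific paper above.
    """
    norm = math.sqrt(sum(v * v for v in x))
    if norm == 0.0:
        return list(x)
    # exp_0(x) = tanh(||x||) * x / ||x||  -- output always lies inside the unit ball
    return [math.tanh(norm) * v / norm for v in x]

def log0(y):
    """Inverse map: pull a ball point back to the tangent space at the origin."""
    norm = math.sqrt(sum(v * v for v in y))
    if norm == 0.0:
        return list(y)
    # log_0(y) = artanh(||y||) * y / ||y||
    return [math.atanh(norm) * v / norm for v in y]
```

Because `exp0` and `log0` are mutual inverses, methods like the low-rank adaptation above can compute updates in the flat tangent space and project the result onto the manifold.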
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.