Overlapping Spaces for Compact Graph Representations
- URL: http://arxiv.org/abs/2007.02445v3
- Date: Fri, 8 Apr 2022 14:00:35 GMT
- Title: Overlapping Spaces for Compact Graph Representations
- Authors: Kirill Shevkunov and Liudmila Prokhorenkova
- Abstract summary: Various non-trivial spaces are becoming popular for embedding structured data such as graphs, texts, or images.
We generalize the concept of product space and introduce an overlapping space that does not have the configuration search problem.
- Score: 17.919759296265
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Various non-trivial spaces are becoming popular for embedding structured data
such as graphs, texts, or images. Following spherical and hyperbolic spaces,
more general product spaces have been proposed. However, searching for the best
configuration of product space is a resource-intensive procedure, which reduces
the practical applicability of the idea. We generalize the concept of product
space and introduce an overlapping space that does not have the configuration
search problem. The main idea is to allow subsets of coordinates to be shared
between spaces of different types (Euclidean, hyperbolic, spherical). As a
result, parameter optimization automatically learns the optimal configuration.
Additionally, overlapping spaces allow for more compact representations since
their geometry is more complex. Our experiments confirm that overlapping spaces
outperform the competitors in graph embedding tasks. Here, we consider both
the distortion setup, where the aim is to preserve distances, and the ranking
setup, where the relative order should be preserved; the proposed method
outperforms the competitors in both settings. We also
perform an empirical analysis in a realistic information retrieval task, where
we compare all spaces by incorporating them into DSSM. In this case, the
proposed overlapping space consistently achieves nearly optimal results without
any configuration tuning. This allows for reducing training time, which can be
significant in large-scale applications.
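To make the distance computation concrete, here is a minimal sketch in Python of how an overlapping-space distance can be evaluated, assuming a product-space-style combination (square root of the sum of squared component distances) over three possibly overlapping coordinate index sets. The index sets, curvatures, and aggregation below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def euclidean_dist(x, y):
    return np.linalg.norm(x - y)

def poincare_dist(x, y, eps=1e-9):
    # Distance in the Poincaré ball model of hyperbolic space;
    # assumes both points lie strictly inside the unit ball.
    sq = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2)) + eps
    return np.arccosh(1.0 + 2.0 * sq / denom)

def spherical_dist(x, y, eps=1e-9):
    # Great-circle distance after projecting onto the unit sphere.
    xn = x / (np.linalg.norm(x) + eps)
    yn = y / (np.linalg.norm(y) + eps)
    return np.arccos(np.clip(np.dot(xn, yn), -1.0, 1.0))

def overlapping_dist(x, y, idx_e, idx_h, idx_s):
    # idx_e, idx_h, idx_s are coordinate index sets that may overlap,
    # so a shared coordinate contributes to several geometries at once.
    # Component distances are combined as in a product space.
    d_e = euclidean_dist(x[idx_e], y[idx_e])
    d_h = poincare_dist(x[idx_h], y[idx_h])
    d_s = spherical_dist(x[idx_s], y[idx_s])
    return np.sqrt(d_e ** 2 + d_h ** 2 + d_s ** 2)

# Example: a 6-dimensional embedding where coordinates 2-3 are shared
# between the Euclidean and hyperbolic parts.
x = np.array([0.1, 0.2, 0.05, 0.1, 0.3, -0.2])
y = np.array([0.0, 0.1, 0.02, 0.2, 0.1, 0.4])
print(overlapping_dist(x, y, idx_e=[0, 1, 2, 3], idx_h=[2, 3], idx_s=[4, 5]))
```

Because shared coordinates (here 2 and 3) feed several component geometries at once, gradient-based training can discover the effective configuration instead of searching over discrete product-space signatures.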
Related papers
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive
Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity, which are related to properties of trees.
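As a rough illustration of the alignment-and-uniformity flavor of such objectives, here is the standard Euclidean decomposition (a sketch of the general idea only; the paper's hyperbolic, tree-aware leaf- and height-level variants are more involved):

```python
import numpy as np

def alignment_loss(z1, z2):
    # Mean squared distance between embeddings of positive pairs;
    # lower values mean the two views of the same node stay close.
    # z1, z2: (n, d) arrays where row i of z1 is paired with row i of z2.
    return np.mean(np.sum((z1 - z2) ** 2, axis=1))

def uniformity_loss(z, t=2.0):
    # Log of the average Gaussian potential over all distinct pairs;
    # lower values mean the embeddings spread out more uniformly.
    sq = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    off_diagonal = sq[~np.eye(z.shape[0], dtype=bool)]
    return np.log(np.mean(np.exp(-t * off_diagonal)))
```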
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Principal Component Analysis in Space Forms [7.822210329345704]
We study Principal Component Analysis (PCA) in space forms.
Finding the optimal low-dimensional affine subspace for given points in a space form amounts to dimensionality reduction.
We propose proper cost functions that enjoy two properties: (1) their optimal affine subspace is the solution to an eigenequation, and (2) optimal affine subspaces of different dimensions form a nested set.
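The eigenequation property is easiest to see in the Euclidean base case, where the optimal affine subspace passes through the data mean and is spanned by the top eigenvectors of the covariance matrix; a minimal sketch (the paper replaces these ingredients with their space-form counterparts):

```python
import numpy as np

def pca_affine_subspace(X, k):
    # X: (n, d) data matrix. Returns the affine offset (the mean) and the
    # top-k eigenvectors of the covariance matrix (the subspace directions).
    mu = X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
    order = np.argsort(eigvals)[::-1]        # sort eigenvalues descending
    return mu, eigvecs[:, order[:k]]

# Nestedness (property 2): the optimal k-dimensional subspace is contained
# in the optimal (k+1)-dimensional one, since both take leading eigenvectors.
```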
arXiv Detail & Related papers (2023-01-06T23:48:37Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometry-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
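As a guess at the flavor of an origin-aware penalty (a hypothetical sketch, not HRCF's actual regularizer), one could reward hyperbolic distance from the origin so that aggregated embeddings do not all collapse near the center of the Poincaré ball:

```python
import numpy as np

def origin_aware_penalty(Z, eps=1e-9):
    # Hypothetical sketch: Z holds (n, d) points inside the unit
    # (Poincaré) ball. d(0, z) = 2 * arctanh(||z||) in this model, so
    # minimizing the negative mean distance pushes points away from the
    # origin, counteracting the over-smoothing collapse.
    norms = np.linalg.norm(Z, axis=1)
    dist_to_origin = 2.0 * np.arctanh(np.clip(norms, 0.0, 1.0 - eps))
    return -np.mean(dist_to_origin)
```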
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincaré ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincaré second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
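A minimal sketch of the tangent-space idea, assuming the Poincaré ball with curvature -1 and one common convention for the logarithmic map at the origin (the paper's formalism is more general): map points into the tangent space, where the geometry is Euclidean, and fit an ordinary linear classifier there.

```python
import numpy as np

def log_map_origin(X, eps=1e-9):
    # Logarithmic map at the origin of the Poincaré ball (curvature -1),
    # in one common convention: x -> arctanh(||x||) * x / ||x||.
    # X: (n, d) points with Euclidean norms strictly below 1.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return np.arctanh(np.clip(norms, 0.0, 1.0 - eps)) * X / (norms + eps)

def perceptron(X, y, epochs=100):
    # A plain Euclidean perceptron; y entries are -1 or +1.
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified: update
                w, b = w + yi * xi, b + yi
    return w, b

# Hypothetical usage: w, b = perceptron(log_map_origin(X_ball), y)
```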
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- Highly Scalable and Provably Accurate Classification in Poincaré Balls [40.82908295137667]
We establish a unified framework for learning scalable and simple hyperbolic linear classifiers with provable performance guarantees.
Our results include a new hyperbolic and second-order perceptron algorithm as well as an efficient and highly accurate convex optimization setup for hyperbolic support vector machine classifiers.
We demonstrate their performance on synthetic data sets comprising millions of points, as well as on complex real-world data sets such as single-cell RNA-seq expression measurements, CIFAR10, Fashion-MNIST, and mini-ImageNet.
arXiv Detail & Related papers (2021-09-08T16:59:39Z)
- Switch Spaces: Learning Product Spaces with Sparse Gating [48.591045282317424]
We propose Switch Spaces, a data-driven approach for learning representations in product space.
We introduce sparse gating mechanisms that learn to choose, combine and switch spaces.
Experiments on knowledge graph completion and item recommendation show that the proposed switch space achieves new state-of-the-art performance.
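A rough sketch of what a sparse gate over candidate spaces might look like (illustrative only; the paper's gating and its training procedure are more elaborate): score each space, keep the top-k, and combine their distances with renormalized softmax weights.

```python
import numpy as np

def sparse_gate_distance(component_dists, scores, k=2):
    # component_dists: distance between two points in each candidate space.
    # scores: learned gating logits, one per space. The gate keeps the
    # top-k spaces and mixes their distances with softmax weights.
    component_dists = np.asarray(component_dists, dtype=float)
    scores = np.asarray(scores, dtype=float)
    top = np.argsort(scores)[::-1][:k]            # selected spaces
    w = np.exp(scores[top] - scores[top].max())   # numerically stable softmax
    return float(np.dot(w / w.sum(), component_dists[top]))

# Example with Euclidean, hyperbolic, and spherical component distances:
print(sparse_gate_distance([0.8, 0.3, 1.1], scores=[0.2, 1.5, -0.7], k=2))
```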
arXiv Detail & Related papers (2021-02-17T11:06:59Z)
- Joint and Progressive Subspace Analysis (JPSA) with Spatial-Spectral Manifold Alignment for Semi-Supervised Hyperspectral Dimensionality Reduction [48.73525876467408]
We propose a novel technique for hyperspectral subspace analysis.
The technique is called joint and progressive subspace analysis (JPSA).
Experiments are conducted to demonstrate the superiority and effectiveness of the proposed JPSA on two widely-used hyperspectral datasets.
arXiv Detail & Related papers (2020-09-21T16:29:59Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
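As a toy illustration of margin learning via injected adversarial examples (a Euclidean stand-in for the paper's hyperbolic construction; the perturbation rule here, shifting points toward the current decision boundary, is an assumption, not the authors' scheme):

```python
import numpy as np

def adversarial_copy(x, y, w, margin=0.1):
    # Shift a point toward the decision boundary of the current linear
    # classifier, producing a harder example with a smaller margin.
    direction = -y * w / (np.linalg.norm(w) + 1e-9)
    return x + margin * direction

def train_with_adversarial(X, y, rounds=20, margin=0.1):
    w, b = np.zeros(X.shape[1]), 0.0
    Xa, ya = X.copy(), y.copy()
    for _ in range(rounds):
        for xi, yi in zip(Xa, ya):       # one perceptron pass
            if yi * (w @ xi + b) <= 0:
                w, b = w + yi * xi, b + yi
        # augment the training set with adversarial copies of the originals
        adv = np.array([adversarial_copy(xi, yi, w, margin)
                        for xi, yi in zip(X, y)])
        Xa, ya = np.vstack([X, adv]), np.concatenate([y, y])
    return w, b
```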
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
- Learnable Subspace Clustering [76.2352740039615]
We develop a learnable subspace clustering paradigm to efficiently solve the large-scale subspace clustering problem.
The key idea is to learn a parametric function to partition the high-dimensional subspaces into their underlying low-dimensional subspaces.
To the best of our knowledge, this is the first subspace clustering method that efficiently clusters millions of data points.
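A minimal sketch of the parametric idea (a hypothetical linear scoring model, not the paper's architecture): learn a function that maps each point directly to cluster scores, so that assigning millions of points reduces to a single matrix product instead of solving a large self-expressive optimization.

```python
import numpy as np

def assign_clusters(X, W):
    # X: (n, d) data; W: (d, k) learned parameters of the partition
    # function. Each point goes to the cluster with the highest score.
    return np.argmax(X @ W, axis=1)
```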
arXiv Detail & Related papers (2020-04-09T12:53:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.