Hyperbolic Convolution via Kernel Point Aggregation
- URL: http://arxiv.org/abs/2306.08862v1
- Date: Thu, 15 Jun 2023 05:15:13 GMT
- Title: Hyperbolic Convolution via Kernel Point Aggregation
- Authors: Eric Qu, Dongmian Zou
- Abstract summary: We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space.
We show that neural networks with HKConv layers advance the state of the art in various tasks.
- Score: 4.061135251278187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning representations according to the underlying geometry is of vital
importance for non-Euclidean data. Studies have revealed that the hyperbolic
space can effectively embed hierarchical or tree-like data. In particular, the
past few years have witnessed rapid development of hyperbolic neural
networks. However, it is challenging to learn good hyperbolic representations
since common Euclidean neural operations, such as convolution, do not extend to
the hyperbolic space. Most hyperbolic neural networks do not embrace the
convolution operation and ignore local patterns. Others either only use
non-hyperbolic convolution, or miss essential properties such as equivariance
to permutation. We propose HKConv, a novel trainable hyperbolic convolution
which first correlates trainable local hyperbolic features with fixed kernel
points placed in the hyperbolic space, then aggregates the output features
within a local neighborhood. HKConv not only expressively learns local features
according to the hyperbolic geometry, but also enjoys equivariance to
permutation of hyperbolic points and invariance to parallel transport of a
local neighborhood. We show that neural networks with HKConv layers advance the
state of the art in various tasks.
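To make the mechanism concrete, below is a minimal, illustrative sketch in PyTorch of a kernel-point-style hyperbolic convolution on the Poincaré ball. Everything specific here is an assumption made for the sketch: the names (HKConvSketch, mobius_add, poincare_dist), the choice of correlating relative positions with kernel points through a softmax over negative hyperbolic distances, and the treatment of input features as Euclidean (tangent-space) vectors. The paper's actual HKConv construction, including how features are correlated with the kernel points and how outputs are mapped back onto the manifold, may differ.

```python
import torch
import torch.nn as nn

EPS = 1e-6  # numerical safety margin on the Poincaré ball (curvature c = 1)

def mobius_add(x, y):
    """Möbius addition on the Poincaré ball (c = 1)."""
    x2 = (x * x).sum(-1, keepdim=True)
    y2 = (y * y).sum(-1, keepdim=True)
    xy = (x * y).sum(-1, keepdim=True)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den.clamp_min(EPS)

def poincare_dist(x, y):
    """Hyperbolic distance between points of the Poincaré ball (c = 1)."""
    diff = mobius_add(-x, y)
    norm = diff.norm(dim=-1).clamp(max=1 - EPS)
    return 2 * torch.atanh(norm)

class HKConvSketch(nn.Module):
    """Illustrative kernel-point-style hyperbolic convolution (not the paper's exact layer).

    Each neighbour is re-expressed relative to its centre point (Möbius
    "translation" toward the origin), correlated with K fixed kernel points
    via hyperbolic distance, and the kernel-weighted features are summed
    over the neighbourhood (a permutation-invariant aggregation).
    """

    def __init__(self, in_dim, out_dim, num_kernels=8, dim=2):
        super().__init__()
        # Fixed (non-trainable) kernel points placed inside the unit ball.
        pts = torch.randn(num_kernels, dim)
        self.register_buffer(
            "kernel_points", 0.3 * pts / (1 + pts.norm(dim=-1, keepdim=True))
        )
        # One trainable linear map per kernel point.
        self.weights = nn.Parameter(torch.randn(num_kernels, in_dim, out_dim) * 0.1)

    def forward(self, centers, neighbors, feats):
        """
        centers:   (N, dim)        centre positions in the ball
        neighbors: (N, M, dim)     M neighbour positions per centre
        feats:     (N, M, in_dim)  neighbour features (treated as tangent vectors)
        returns:   (N, out_dim)
        """
        # Express each neighbour relative to its centre: (-x_i) ⊕ x_j.
        rel = mobius_add(-centers.unsqueeze(1), neighbors)            # (N, M, dim)
        # Correlate each relative position with every kernel point.
        d = poincare_dist(rel.unsqueeze(2), self.kernel_points)       # (N, M, K)
        corr = torch.softmax(-d, dim=-1)                              # closer kernel -> larger weight
        # Apply the per-kernel linear maps to the neighbour features.
        mapped = torch.einsum("nmf,kfo->nmko", feats, self.weights)   # (N, M, K, out_dim)
        # Kernel-weighted combination, then permutation-invariant sum over neighbours.
        return (corr.unsqueeze(-1) * mapped).sum(dim=(1, 2))          # (N, out_dim)
```

Re-expressing each neighbour relative to its centre before comparing against the fixed kernel points makes the correlation depend only on the neighbourhood's internal arrangement rather than its absolute position, a rough analogue of the invariance described in the abstract, and the final sum over neighbours gives equivariance to their permutation. A complete hyperbolic layer would additionally map the aggregated output back onto the manifold (for example via the exponential map), which is omitted here for brevity.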
Related papers
- Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- A Unification Framework for Euclidean and Hyperbolic Graph Neural Networks [8.080621697426997]
Hyperbolic neural networks can effectively capture the inherent hierarchy of graph datasets.
They entangle multiple incongruent (gyro-)vector spaces within a layer, which makes them limited in terms of generalization and scalability.
We propose the Poincaré disk model as our search space and apply all approximations on the disk.
We demonstrate that our model not only leverages the power of Euclidean networks such as interpretability and efficient execution of various model components, but also outperforms both Euclidean and hyperbolic counterparts on various benchmarks.
arXiv Detail & Related papers (2022-06-09T05:33:02Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
The proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Hyperbolic Vision Transformers: Combining Improvements in Metric Learning [116.13290702262248]
We propose a new hyperbolic-based model for metric learning.
At the core of our method is a vision transformer with output embeddings mapped to hyperbolic space.
We evaluate the proposed model with six different formulations on four datasets.
arXiv Detail & Related papers (2022-03-21T09:48:23Z)
- HyLa: Hyperbolic Laplacian Features For Graph Learning [44.33054069927441]
Hyperbolic space can support embeddings of tree- and graph-structured data.
For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks.
Existing hyperbolic networks are computationally expensive and can be numerically unstable.
We propose HyLa, a completely different approach to using hyperbolic space in graph learning.
arXiv Detail & Related papers (2022-02-14T16:40:24Z)
- Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design [8.250374560598493]
Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently.
The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space.
We present a novel fully hyperbolic neural network which uses projections (embeddings) followed by intrinsic aggregation and a nonlinearity, all within the hyperbolic space.
arXiv Detail & Related papers (2021-12-03T03:20:27Z)
- Fully Hyperbolic Neural Networks [63.22521652077353]
We propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model.
We show that our method has better performance for building both shallow and deep networks.
arXiv Detail & Related papers (2021-05-31T03:36:49Z)
- A Hyperbolic-to-Hyperbolic Graph Convolutional Network [46.80564170208473]
We propose a hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that works directly on the hyperbolic manifold.
The H2H-GCN achieves substantial improvements on link prediction, node classification, and graph classification tasks.
arXiv Detail & Related papers (2021-04-14T16:09:27Z)