Hyperbolic Generative Adversarial Network
- URL: http://arxiv.org/abs/2102.05567v1
- Date: Wed, 10 Feb 2021 16:55:27 GMT
- Title: Hyperbolic Generative Adversarial Network
- Authors: Diego Lazcano, Nicolás Fredes and Werner Creixell
- Abstract summary: We propose that it is possible to take advantage of the hierarchical structure present in images by using hyperbolic neural networks in a GAN architecture.
In this study, different configurations using fully connected hyperbolic layers in the GAN, CGAN, and WGAN are tested, in what we call the HGAN, HCGAN, and HWGAN, respectively.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Recently, hyperbolic spaces have gained popularity in the context of
non-Euclidean deep learning because of their ability to represent hierarchical
data. We propose that it is possible to take advantage of the hierarchical
structure present in images by using hyperbolic neural networks in a GAN
architecture. In this study, different configurations using fully connected
hyperbolic layers in the GAN, CGAN, and WGAN are tested, in what we call the
HGAN, HCGAN, and HWGAN, respectively. The results are measured using the
Inception Score (IS) and the Fréchet Inception Distance (FID) on the MNIST
dataset. Depending on the configuration and the space curvature, each proposed
hyperbolic version achieves better results than its Euclidean counterpart.
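The paper itself ships no code here; the following is a minimal sketch of the building block the abstract describes, a fully connected hyperbolic layer on the Poincaré ball. The curvature value, layer sizes, and all names are illustrative assumptions, not the authors' implementation, and the affine map is applied in the tangent space at the origin (a common simplification of the Möbius formulation, which handles the bias via Möbius addition instead).

```python
import torch
import torch.nn as nn

class MobiusLinear(nn.Module):
    """Fully connected layer on the Poincare ball (illustrative sketch).

    Applies a Euclidean affine map in the tangent space at the origin:
    x -> exp_0(relu(W @ log_0(x) + b)). Curvature c > 0 is fixed here.
    """

    def __init__(self, in_features, out_features, c=1.0, nonlin=True):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.c = c
        self.nonlin = nonlin

    def log0(self, x):
        # Logarithmic map at the origin of the Poincare ball.
        sqrt_c = self.c ** 0.5
        norm = x.norm(dim=-1, keepdim=True).clamp_min(1e-7)
        return torch.atanh((sqrt_c * norm).clamp(max=1 - 1e-5)) * x / (sqrt_c * norm)

    def exp0(self, v):
        # Exponential map at the origin of the Poincare ball.
        sqrt_c = self.c ** 0.5
        norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-7)
        return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

    def forward(self, x):
        h = self.linear(self.log0(x))
        if self.nonlin:
            h = torch.relu(h)  # tangent-space nonlinearity (Mobius ReLU)
        return self.exp0(h)


# Hypothetical HGAN-style generator: a stack of Mobius layers mapping noise
# to flattened 28x28 MNIST images (sizes are assumptions, not the paper's).
generator = nn.Sequential(
    MobiusLinear(100, 256), MobiusLinear(256, 512), MobiusLinear(512, 784, nonlin=False)
)
z = torch.randn(16, 100) * 0.01  # small noise so inputs lie inside the ball
fake = generator(z)
```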
Related papers
- On the Universal Statistical Consistency of Expansive Hyperbolic Deep Convolutional Neural Networks [14.904264782690639]
In this work, we propose Hyperbolic DCNN based on the Poincaré Disc.
We offer extensive theoretical insights pertaining to the universal consistency of the expansive convolution in the hyperbolic space.
Results reveal that the hyperbolic convolutional architecture outperforms the Euclidean ones by a commendable margin.
arXiv Detail & Related papers (2024-11-15T12:01:03Z)
- Hyperbolic Delaunay Geometric Alignment [52.835250875177756]
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
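A hedged sketch of the core idea in the HyperDGA entry above: build a Delaunay graph over the pooled embeddings and count the edges connecting the two sets. For simplicity this uses SciPy's Euclidean Delaunay triangulation on Poincaré-disc coordinates as a stand-in for the paper's hyperbolic construction; the function name and the normalization are assumptions.

```python
import numpy as np
from scipy.spatial import Delaunay

def cross_set_delaunay_edges(A, B):
    """Fraction of Delaunay edges connecting datasets A and B (sketch).

    A, B: (n, 2) and (m, 2) arrays of 2-D embeddings (e.g. points in the
    Poincare disc). NOTE: scipy builds the *Euclidean* Delaunay graph;
    HyperDGA uses the hyperbolic Delaunay graph, so this only
    approximates the idea.
    """
    pts = np.vstack([A, B])
    label = np.array([0] * len(A) + [1] * len(B))
    tri = Delaunay(pts)
    edges = set()
    for simplex in tri.simplices:            # each simplex is a triangle
        for i in range(3):
            for j in range(i + 1, 3):
                edges.add(tuple(sorted((simplex[i], simplex[j]))))
    cross = sum(label[i] != label[j] for i, j in edges)
    return cross / len(edges)                # assumed normalization

rng = np.random.default_rng(0)
A = rng.normal(0.0, 0.2, (100, 2))
B = rng.normal(0.3, 0.2, (100, 2))
print(cross_set_delaunay_edges(A, B))       # higher = more interleaved sets
```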
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
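The following sketch illustrates the energy-minimization view in the entry above: node embeddings obtained by gradient descent on a hypergraph-regularized quadratic energy. The specific clique-expansion energy and all names are illustrative assumptions, not the paper's parameterized family.

```python
import torch

def hypergraph_energy_embeddings(features, hyperedges, lam=1.0, steps=200, lr=0.1):
    """Node embeddings as minimizers of a hypergraph energy (sketch).

    E(X) = ||X - features||^2 + lam * sum_e sum_{i<j in e} ||x_i - x_j||^2,
    a simple clique-expansion smoothness term (an assumption, not the
    paper's expressive family of energies).
    """
    X = features.clone().requires_grad_(True)
    opt = torch.optim.SGD([X], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        energy = ((X - features) ** 2).sum()
        for e in hyperedges:                       # e: list of node indices
            nodes = X[list(e)]
            diff = nodes.unsqueeze(0) - nodes.unsqueeze(1)
            energy = energy + lam * (diff ** 2).sum() / 2  # each pair once
        energy.backward()
        opt.step()
    return X.detach()

feats = torch.randn(6, 4)
edges = [[0, 1, 2], [2, 3, 4, 5]]
emb = hypergraph_energy_embeddings(feats, edges)  # smoothed within hyperedges
```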
- Fully Hyperbolic Convolutional Neural Networks for Computer Vision [3.3964154468907486]
We present HCNN, a fully hyperbolic convolutional neural network (CNN) designed for computer vision tasks.
Based on the Lorentz model, we propose novel formulations of the convolutional layer, batch normalization, and multinomial logistic regression.
Experiments on standard vision tasks demonstrate the promising performance of our HCNN framework in both hybrid and fully hyperbolic settings.
arXiv Detail & Related papers (2023-03-28T12:20:52Z)
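A minimal sketch of the Lorentz-model machinery the HCNN entry above builds on: mapping Euclidean feature vectors onto the hyperboloid via the exponential map at the origin. The curvature handling and names are assumptions; the paper's convolution, batch-norm, and logistic-regression formulations are considerably more involved.

```python
import torch

def lorentz_exp0(v, k=1.0):
    """Exponential map at the hyperboloid origin (sketch).

    v: (..., d) tangent vectors at the origin o = (1/sqrt(k), 0, ..., 0)
    of the hyperboloid x0^2 - x1^2 - ... - xd^2 = 1/k (curvature -k, k > 0).
    Returns points of shape (..., d+1) on the hyperboloid.
    """
    sqrt_k = k ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-7)
    x0 = torch.cosh(sqrt_k * norm) / sqrt_k
    xr = torch.sinh(sqrt_k * norm) * v / (sqrt_k * norm)
    return torch.cat([x0, xr], dim=-1)

x = lorentz_exp0(torch.randn(8, 3) * 0.1)
# Sanity check: the Lorentzian inner product <x, x>_L should equal -1/k.
inner = -x[:, :1] ** 2 + (x[:, 1:] ** 2).sum(dim=-1, keepdim=True)
print(inner)  # approximately -1.0 for k = 1
```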
- A Unification Framework for Euclidean and Hyperbolic Graph Neural Networks [8.080621697426997]
Hyperbolic neural networks can effectively capture the inherent hierarchy of graph datasets.
However, they entangle multiple incongruent (gyro-)vector spaces within a layer, which limits their generalization and scalability.
We propose the Poincaré disk model as our search space, and apply all approximations on the disk.
We demonstrate that our model not only leverages the power of Euclidean networks such as interpretability and efficient execution of various model components, but also outperforms both Euclidean and hyperbolic counterparts on various benchmarks.
arXiv Detail & Related papers (2022-06-09T05:33:02Z)
- Optimization-Based Separations for Neural Networks [57.875347246373956]
We show that gradient descent can efficiently learn ball indicator functions using a depth 2 neural network with two layers of sigmoidal activations.
This is the first optimization-based separation result where the approximation benefits of the stronger architecture provably manifest in practice.
arXiv Detail & Related papers (2021-12-04T18:07:47Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, together with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
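A hedged sketch of the adaptive-curvature idea in the ACE-HGNN entry above: treating the ball curvature as a trainable parameter optimized jointly with the network weights. ACE-HGNN itself learns curvature with a multi-agent reinforcement-learning scheme, which this simple gradient-based stand-in does not reproduce; all names here are assumptions.

```python
import torch
import torch.nn as nn

class AdaptiveCurvature(nn.Module):
    """Learnable Poincare-ball curvature c > 0 (illustrative stand-in).

    Parameterized through softplus so c stays positive during training.
    """

    def __init__(self, init_c=1.0):
        super().__init__()
        # Inverse softplus of init_c so that c starts at init_c.
        raw = torch.log(torch.expm1(torch.tensor(init_c)))
        self.raw_c = nn.Parameter(raw)

    @property
    def c(self):
        return torch.nn.functional.softplus(self.raw_c)

    def exp0(self, v):
        # Exponential map at the origin using the current curvature.
        sqrt_c = self.c.sqrt()
        norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-7)
        return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

curv = AdaptiveCurvature(init_c=1.0)
x = curv.exp0(torch.randn(4, 8))
# curv.raw_c receives gradients through any loss computed on x, so the
# curvature is trained jointly with the rest of the model's weights.
```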
- Improvising the Learning of Neural Networks on Hyperspherical Manifold [0.0]
Convolutional neural networks (CNNs) have provided tremendous performance gains in supervised settings.
Representations learned by CNNs operating on a hyperspherical manifold have led to insightful outcomes in face recognition.
A broad range of activation functions has been developed with hypersphere intuition, performing better than softmax in Euclidean space.
arXiv Detail & Related papers (2021-09-29T22:39:07Z)
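A minimal sketch of the hypersphere intuition in the entry above: a cosine (normalized) classification head that scores classes by angle on the unit hypersphere rather than by unnormalized dot products as in standard softmax. The scale parameter and names are assumptions; the paper surveys a broader family of such activations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifier(nn.Module):
    """Hyperspherical (cosine) classification head (sketch).

    Both features and class weights are L2-normalized, so the logits are
    cosine similarities scaled by s; softmax then acts on angles only.
    """

    def __init__(self, in_features, num_classes, s=16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, in_features))
        self.s = s

    def forward(self, x):
        x = F.normalize(x, dim=-1)          # project features onto the sphere
        w = F.normalize(self.weight, dim=-1)
        return self.s * x @ w.t()           # scaled cosine logits

head = CosineClassifier(128, 10)
logits = head(torch.randn(32, 128))
loss = F.cross_entropy(logits, torch.randint(0, 10, (32,)))
```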
- Free Hyperbolic Neural Networks with Limited Radii [32.42488915688723]
Hyperbolic Neural Networks (HNNs) that operate directly in hyperbolic space have been proposed recently to further exploit the potential of hyperbolic representations.
While HNNs have achieved better performance than Euclidean neural networks (ENNs) on datasets with implicit hierarchical structure, they still perform poorly on standard classification benchmarks such as CIFAR and ImageNet.
In this paper, we first conduct an empirical study showing that the inferior performance of HNNs on standard recognition datasets can be attributed to the notorious vanishing gradient problem.
Our analysis leads to a simple yet effective solution called Feature Clipping, which regularizes the hyperbolic embedding whenever its norm exceeds a given threshold.
arXiv Detail & Related papers (2021-07-23T22:10:16Z)
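The Feature Clipping fix described above admits a very short sketch: rescale any embedding whose Euclidean norm exceeds a chosen radius before it is mapped into hyperbolic space. The radius value and the function name are assumptions.

```python
import torch

def feature_clip(x, max_norm=1.0):
    """Clip feature norms before the hyperbolic exp map (sketch).

    Rescales each vector whose Euclidean norm exceeds max_norm back onto
    the sphere of radius max_norm, which bounds the hyperbolic embedding
    radius and mitigates vanishing gradients near the ball boundary.
    """
    norm = x.norm(dim=-1, keepdim=True)
    scale = torch.clamp(max_norm / norm.clamp_min(1e-7), max=1.0)
    return x * scale

x = torch.randn(4, 16) * 3.0
print(feature_clip(x).norm(dim=-1))  # all norms <= 1.0
```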
- Spatial Dependency Networks: Neural Layers for Improved Generative Image Modeling [79.15521784128102]
We introduce a novel neural network for building image generators (decoders) and apply it to variational autoencoders (VAEs).
In our spatial dependency networks (SDNs), feature maps at each level of a deep neural net are computed in a spatially coherent way.
We show that augmenting the decoder of a hierarchical VAE by spatial dependency layers considerably improves density estimation.
arXiv Detail & Related papers (2021-03-16T07:01:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.