ChebLieNet: Invariant Spectral Graph NNs Turned Equivariant by
Riemannian Geometry on Lie Groups
- URL: http://arxiv.org/abs/2111.12139v1
- Date: Tue, 23 Nov 2021 20:19:36 GMT
- Title: ChebLieNet: Invariant Spectral Graph NNs Turned Equivariant by
Riemannian Geometry on Lie Groups
- Authors: Hugo Aguettaz, Erik J. Bekkers, Michaël Defferrard
- Abstract summary: ChebLieNet is a group-equivariant method on (anisotropic) manifolds.
We develop a graph neural network made of anisotropic convolutional layers.
We empirically demonstrate the existence of (data-dependent) sweet spots for anisotropic parameters on CIFAR10.
- Score: 9.195729979000404
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce ChebLieNet, a group-equivariant method on (anisotropic)
manifolds. Building on the success of graph- and group-based neural networks, we
take advantage of the recent developments in the geometric deep learning field
to derive a new approach to exploit any anisotropies in data. Via discrete
approximations of Lie groups, we develop a graph neural network made of
anisotropic convolutional layers (Chebyshev convolutions), spatial pooling and
unpooling layers, and global pooling layers. Group equivariance is achieved via
equivariant and invariant operators on graphs with anisotropic left-invariant
Riemannian distance-based affinities encoded on the edges. Thanks to its simple
form, the Riemannian metric can model any anisotropies, both in the spatial and
orientation domains. This control over the anisotropies of the Riemannian
metric allows us to balance the equivariance (anisotropic metric) of the graph
convolution layers against their invariance (isotropic metric), opening the
door to a better understanding of anisotropic properties. Furthermore, we
empirically demonstrate the existence of (data-dependent) sweet spots for the
anisotropic parameters on CIFAR10. This crucial result is evidence of the
benefit we could gain by exploiting anisotropic properties in data. We also
evaluate the scalability of
this approach on STL10 (image data) and ClimateNet (spherical data), showing
its remarkable adaptability to diverse tasks.
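To make the pipeline concrete, below is a minimal NumPy sketch of the two ingredients the abstract combines: edge affinities derived from an anisotropic left-invariant distance on a discretized SE(2) grid, and a Chebyshev polynomial filter applied to the resulting graph Laplacian. The grid sizes, the diagonal-metric first-order approximation of the Riemannian distance, and all function names are illustrative assumptions, not the authors' released code.
```python
import numpy as np

rng = np.random.default_rng(0)

def se2_grid(nx=8, ny=8, ntheta=6):
    """Nodes of a discretized SE(2): grid positions plus sampled orientations."""
    xs, ys, ts = np.meshgrid(
        np.arange(nx), np.arange(ny),
        np.linspace(0, 2 * np.pi, ntheta, endpoint=False), indexing="ij")
    return np.stack([xs.ravel(), ys.ravel(), ts.ravel()], axis=1)

def anisotropic_affinities(nodes, eps=(1.0, 4.0, 2.0), sigma=1.0, knn=8):
    """Edge weights from a first-order approximation of an anisotropic
    left-invariant distance: each displacement is expressed in the frame of
    the source node, then measured by the diagonal metric `eps`."""
    n = len(nodes)
    W = np.zeros((n, n))
    for i in range(n):
        dx = nodes[:, 0] - nodes[i, 0]
        dy = nodes[:, 1] - nodes[i, 1]
        dt = np.angle(np.exp(1j * (nodes[:, 2] - nodes[i, 2])))  # wrap angle
        c, s = np.cos(nodes[i, 2]), np.sin(nodes[i, 2])
        du, dv = c * dx + s * dy, -s * dx + c * dy  # rotate into node i's frame
        d2 = eps[0] * du ** 2 + eps[1] * dv ** 2 + eps[2] * dt ** 2
        nbrs = np.argsort(d2)[1:knn + 1]            # k nearest, excluding self
        W[i, nbrs] = np.exp(-d2[nbrs] / (2 * sigma ** 2))
    return np.maximum(W, W.T)                        # symmetrize

def cheb_conv(W, x, theta):
    """y = sum_k theta[k] T_k(L_scaled) x via the Chebyshev recurrence."""
    d_isqrt = 1.0 / np.sqrt(np.maximum(W.sum(axis=1), 1e-12))
    L = np.eye(len(W)) - d_isqrt[:, None] * W * d_isqrt[None, :]
    L = L - np.eye(len(W))    # rescale, assuming lambda_max ~= 2
    t_prev, t_cur = x, L @ x  # T_0 x and T_1 x
    y = theta[0] * t_prev + theta[1] * t_cur
    for k in range(2, len(theta)):
        t_prev, t_cur = t_cur, 2 * L @ t_cur - t_prev
        y += theta[k] * t_cur
    return y

nodes = se2_grid()
W = anisotropic_affinities(nodes)  # eps=(1, 1, 1) would make the metric isotropic
y = cheb_conv(W, rng.normal(size=len(nodes)), theta=np.array([0.5, 0.3, 0.2]))
```
In the abstract's framing, an anisotropic `eps` preserves equivariance while an isotropic one pushes the layer toward invariance; the spatial pooling, unpooling, and global pooling layers of the full network are omitted from this sketch.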
Related papers
- Geometric Generative Models based on Morphological Equivariant PDEs and GANs [3.6498648388765513]
We propose a geometric generative model based on an equivariant partial differential equation (PDE) for group convolutional neural networks (G-CNNs).
The proposed geometric morphological GAN (GM-GAN) is obtained by using the proposed morphological equivariant convolutions in PDE-G-CNNs.
Preliminary results show that the GM-GAN model outperforms classical GANs.
arXiv Detail & Related papers (2024-03-22T01:02:09Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, that can address this situation.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Gradient-Based Feature Learning under Structured Data [57.76552698981579]
In the anisotropic setting, the commonly used spherical gradient dynamics may fail to recover the true direction.
We show that appropriate weight normalization that is reminiscent of batch normalization can alleviate this issue.
In particular, under the spiked model with a suitably large spike, the sample complexity of gradient-based training can be made independent of the information exponent.
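A toy sketch of this weight-normalization idea, assuming a single-index model tanh(⟨u, x⟩) with anisotropic Gaussian inputs; the model, the step size, and the Σ-dependent normalization (batch normalization fixes the preactivation variance wᵀΣw, hence the analogy) are assumptions read from the abstract alone, not the paper's algorithm.
```python
import numpy as np

rng = np.random.default_rng(0)
d, n, eta, steps = 20, 5000, 0.5, 200
Sigma = np.diag(np.linspace(0.2, 5.0, d))      # anisotropic input covariance
u = np.zeros(d); u[0] = 1.0                    # true ("spiked") direction
X = rng.normal(size=(n, d)) @ np.sqrt(Sigma)   # x ~ N(0, Sigma); Sigma is diagonal
y = np.tanh(X @ u)                             # single-index targets

def train(normalize):
    """Gradient descent on the squared loss, re-normalizing the weight
    vector after every step with the supplied rule."""
    w = normalize(rng.normal(size=d))
    for _ in range(steps):
        p = np.tanh(X @ w)
        grad = X.T @ ((p - y) * (1 - p ** 2)) / n  # chain rule through tanh
        w = normalize(w - eta * grad)
    return abs(w @ u) / np.linalg.norm(w)          # alignment with truth

spherical = lambda w: w / np.linalg.norm(w)        # ||w||_2 = 1
sigma_norm = lambda w: w / np.sqrt(w @ Sigma @ w)  # Var(<w, x>) = 1
print("spherical:", train(spherical), "Sigma-normalized:", train(sigma_norm))
```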
arXiv Detail & Related papers (2023-09-07T16:55:50Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Optimization Dynamics of Equivariant and Augmented Neural Networks [2.7918308693131135]
We investigate the optimization of neural networks on symmetric data.
We compare the strategy of constraining the architecture to be equivariant to that of using data augmentation.
Our analysis reveals that even in the latter situation, stationary points may be unstable for augmented training although they are stable for the manifestly equivariant models.
arXiv Detail & Related papers (2023-03-23T17:26:12Z)
- Geometric Scattering on Measure Spaces [12.0756034112778]
We introduce a general, unified model for geometric scattering on measure spaces.
We consider finite measure spaces that are obtained from randomly sampling an unknown manifold.
We propose two methods for constructing a data-driven graph on which the associated graph scattering transform approximates the scattering transform on the underlying manifold.
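A minimal sketch of this recipe under common default choices (points sampled from a circle, a Gaussian ε-neighborhood graph, and dyadic lazy-diffusion wavelets); the paper's two graph constructions may differ in their details.
```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
angles = rng.uniform(0, 2 * np.pi, n)
pts = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # samples of a circle

# Data-driven graph: Gaussian affinities restricted to an eps-neighborhood.
D2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
eps = 0.05
W = np.exp(-D2 / eps) * (D2 < 4 * eps)
np.fill_diagonal(W, 0.0)

# Lazy diffusion operator and dyadic wavelets Psi_j = P^(2^(j-1)) - P^(2^j).
P = 0.5 * (np.eye(n) + W / W.sum(axis=1, keepdims=True))

def wavelet(j):
    return (np.linalg.matrix_power(P, 2 ** (j - 1))
            - np.linalg.matrix_power(P, 2 ** j))

def scattering(x, J=4):
    """First- and second-order scattering coefficients: graph averages of
    |Psi_j x| and of |Psi_k |Psi_j x|| for k > j."""
    coeffs = [x.mean()]
    for j in range(1, J + 1):
        u = np.abs(wavelet(j) @ x)
        coeffs.append(u.mean())
        for k in range(j + 1, J + 1):
            coeffs.append(np.abs(wavelet(k) @ u).mean())
    return np.array(coeffs)

print(scattering(np.cos(3 * angles)))  # a band-limited signal on the samples
```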
arXiv Detail & Related papers (2022-08-17T22:40:09Z)
- Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z)
- Symmetry-driven graph neural networks [1.713291434132985]
We introduce two graph network architectures that are equivariant to several types of transformations affecting the node coordinates.
We demonstrate these capabilities on a synthetic dataset composed of $n$-dimensional geometric objects.
arXiv Detail & Related papers (2021-05-28T18:54:12Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
- Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.