A Wigner-Eckart Theorem for Group Equivariant Convolution Kernels
- URL: http://arxiv.org/abs/2010.10952v4
- Date: Thu, 21 Jan 2021 10:00:28 GMT
- Title: A Wigner-Eckart Theorem for Group Equivariant Convolution Kernels
- Authors: Leon Lang, Maurice Weiler
- Abstract summary: Group equivariant convolutional networks (GCNNs) endow classical convolutional networks with additional symmetry priors.
Recent advances in the theoretical description of GCNNs revealed that such models can generally be understood as performing convolutions with G-steerable kernels.
- Score: 16.143012623830792
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Group equivariant convolutional networks (GCNNs) endow classical
convolutional networks with additional symmetry priors, which can lead to a
considerably improved performance. Recent advances in the theoretical
description of GCNNs revealed that such models can generally be understood as
performing convolutions with G-steerable kernels, that is, kernels that satisfy
an equivariance constraint themselves. While the G-steerability constraint has
been derived, it has to date only been solved for specific use cases - a
general characterization of G-steerable kernel spaces is still missing. This
work provides such a characterization for the practically relevant case of G
being any compact group. Our investigation is motivated by a striking analogy
between the constraints underlying steerable kernels on the one hand and
spherical tensor operators from quantum mechanics on the other hand. By
generalizing the famous Wigner-Eckart theorem for spherical tensor operators,
we prove that steerable kernel spaces are fully understood and parameterized in
terms of 1) generalized reduced matrix elements, 2) Clebsch-Gordan
coefficients, and 3) harmonic basis functions on homogeneous spaces.
Related papers
- C$^3$DG: Conditional Domain Generalization for Hyperspectral Imagery Classification with Convergence and Constrained-risk Theories [23.21421412818663]
Hyperspectral imagery (HSI) classification may suffer from the challenge of hyperspectral-monospectra.
Joint spatial-spectral feature extraction is a popular solution for the problem.
We propose a Convergence and Error-Constrained Conditional Domain Generalization method.
arXiv Detail & Related papers (2024-07-04T18:03:45Z)
- Self-Attention through Kernel-Eigen Pair Sparse Variational Gaussian Processes [20.023544206079304]
We propose Kernel-Eigen Pair Sparse Variational Gaussian Processes (KEP-SVGP) for building uncertainty-aware self-attention.
Experiments verify the method's strong performance and efficiency on in-distribution, distribution-shift, and out-of-distribution benchmarks.
arXiv Detail & Related papers (2024-02-02T15:05:13Z)
- Lie Group Decompositions for Equivariant Neural Networks [12.139222986297261]
We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
arXiv Detail & Related papers (2023-10-17T16:04:33Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish theoretical guarantees on the optimality of alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose the Graph Mechanics Network (GMN), which is efficient, equivariant, and constraint-aware.
GMN represents the forward kinematics information (positions and velocities) of a structural object by generalized coordinates.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Controlling the Complexity and Lipschitz Constant improves polynomial nets [55.121200972539114]
We derive new complexity bounds for the set of Coupled CP-Decomposition (CCP) and Nested Coupled CP-decomposition (NCP) models of Polynomial Nets.
We propose a principled regularization scheme that we evaluate experimentally in six datasets and show that it improves the accuracy as well as the robustness of the models to adversarial perturbations.
arXiv Detail & Related papers (2022-02-10T14:54:29Z)
- Exploiting Redundancy: Separable Group Convolutional Networks on Lie Groups [14.029933823101084]
Group convolutional neural networks (G-CNNs) have been shown to increase parameter efficiency and model accuracy.
In this work, we investigate the properties of representations learned by regular G-CNNs, and show considerable parameter redundancy in group convolution kernels.
We introduce convolution kernels that are separable over the subgroup and channel dimensions.
arXiv Detail & Related papers (2021-10-25T15:56:53Z)
- Coordinate Independent Convolutional Networks -- Isometry and Gauge Equivariant Convolutions on Riemannian Manifolds [70.32518963244466]
A major complication in comparison to flat spaces is that it is unclear in which alignment a convolution kernel should be applied on a manifold.
We argue that the particular choice of coordinatization should not affect a network's inference -- it should be coordinate independent.
A simultaneous demand for coordinate independence and weight sharing is shown to result in a requirement on the network to be equivariant.
arXiv Detail & Related papers (2021-06-10T19:54:19Z)
- Search for Efficient Formulations for Hamiltonian Simulation of non-Abelian Lattice Gauge Theories [0.0]
The Hamiltonian formulation of lattice gauge theories (LGTs) is the most natural framework for quantum simulation.
It remains an important task to identify the most accurate, yet computationally economical, Hamiltonian formulation(s) of such theories.
This paper is a first step toward addressing this question in the case of non-Abelian LGTs.
arXiv Detail & Related papers (2020-09-24T16:44:39Z)
- Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a Gaussian process (GP) regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering that is consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.