Approximately Equivariant Graph Networks
- URL: http://arxiv.org/abs/2308.10436v3
- Date: Fri, 17 Nov 2023 16:29:49 GMT
- Title: Approximately Equivariant Graph Networks
- Authors: Ningyuan Huang, Ron Levie, Soledad Villar
- Abstract summary: Graph neural networks (GNNs) are commonly described as being permutation equivariant with respect to node relabeling in the graph.
We focus on the active symmetries of GNNs, by considering a learning setting where signals are supported on a fixed graph.
- Score: 14.312312714312046
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) are commonly described as being permutation
equivariant with respect to node relabeling in the graph. This symmetry of GNNs
is often compared to the translation equivariance of Euclidean convolutional
neural networks (CNNs). However, these two symmetries are fundamentally
different: The translation equivariance of CNNs corresponds to symmetries of
the fixed domain acting on the image signals (sometimes known as active
symmetries), whereas in GNNs any permutation acts on both the graph signals and
the graph domain (sometimes described as passive symmetries). In this work, we
focus on the active symmetries of GNNs, by considering a learning setting where
signals are supported on a fixed graph. In this case, the natural symmetries of
GNNs are the automorphisms of the graph. Since real-world graphs tend to be
asymmetric, we relax the notion of symmetries by formalizing approximate
symmetries via graph coarsening. We present a bias-variance formula that
quantifies the tradeoff between the loss in expressivity and the gain in the
regularity of the learned estimator, depending on the chosen symmetry group. To
illustrate our approach, we conduct extensive experiments on image inpainting,
traffic flow prediction, and human pose estimation with different choices of
symmetries. We show theoretically and empirically that the best generalization
performance can be achieved by choosing a suitably larger group than the graph
automorphism, but smaller than the permutation group.
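The abstract's recipe can be made concrete with a small sketch. The helper names below (`automorphisms`, `symmetrize`) are ours, not from the paper: we brute-force the automorphism group of a small graph, then project a linear estimator onto the maps equivariant to a chosen group by averaging over it. Choosing which group to average over, anywhere between the automorphism group and the full permutation group, is exactly the knob the bias-variance formula quantifies.

```python
import itertools
import numpy as np

def automorphisms(A):
    """Brute-force the automorphism group of a small graph.

    A is an (n, n) adjacency matrix; returns the permutation matrices P
    with P @ A @ P.T == A. Exponential in n, so illustration only.
    """
    n = A.shape[0]
    eye = np.eye(n)
    group = []
    for sigma in itertools.permutations(range(n)):
        P = eye[list(sigma)]
        if np.array_equal(P @ A @ P.T, A):
            group.append(P)
    return group

def symmetrize(W, group):
    """Project a linear estimator W onto the G-equivariant maps by group
    averaging: (1/|G|) * sum_g P_g.T @ W @ P_g. A larger group gives a
    smaller (more regular, less expressive) hypothesis class."""
    return sum(P.T @ W @ P for P in group) / len(group)

# A 4-cycle: its automorphism group (the dihedral group of order 8).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
G = automorphisms(A)
W_eq = symmetrize(np.random.randn(4, 4), G)  # commutes with every P in G
```

Symmetrizing over all of S_n instead of G recovers the fully permutation-equivariant (most constrained) linear layer, while averaging over the trivial group leaves W unconstrained; the paper's approximate symmetries sit in between.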
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks that approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
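The summary does not spell out LieSD's procedure; the sketch below (all names are ours) only illustrates the condition such discovery methods exploit. If a trained model f is equivariant under a one-parameter group generated by a matrix pair (L_in, L_out), i.e. f(exp(t L_in) x) = exp(t L_out) f(x), then differentiating at t = 0 gives J_f(x) (L_in x) = L_out f(x), which can be tested by finite differences:

```python
import numpy as np

def generator_residual(f, L_in, L_out, xs, eps=1e-5):
    """Average finite-difference residual of the infinitesimal
    equivariance condition  J_f(x) @ (L_in @ x) == L_out @ f(x).
    Near-zero residuals over the samples xs suggest that (L_in, L_out)
    generates a symmetry of the learned input-output mapping."""
    total = 0.0
    for x in xs:
        lhs = (f(x + eps * (L_in @ x)) - f(x)) / eps  # directional derivative
        total += np.linalg.norm(lhs - L_out @ f(x))
    return total / len(xs)
```

Minimizing such a residual over a parametrized set of candidate generators is one way to recover a Lie algebra basis from a trained network.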
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing can be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data efficient, learning with less data, and can generalize across scenarios such as unobserved orientations.
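The DEGNN construction itself is not reproduced here. As background for the claim above, a standard way to guarantee equivariance of message passing to rotations and reflections (and hence to any discrete point group inside them) is to update coordinates only along difference vectors weighted by invariant quantities; a minimal sketch, with `phi` a stand-in for a learned network:

```python
import numpy as np

def equivariant_coordinate_update(x, h, phi):
    """One message-passing step:
    x_i += sum_j phi(h_i, h_j, |x_i - x_j|^2) * (x_i - x_j).

    Rotating or reflecting x rotates the difference vectors while leaving
    the squared distances unchanged, so the update is equivariant by
    construction. x: (n, d) coordinates; h: (n, k) scalar node features.
    """
    new_x = x.copy()
    for i in range(x.shape[0]):
        for j in range(x.shape[0]):
            if i != j:
                d2 = float(np.sum((x[i] - x[j]) ** 2))  # invariant edge feature
                new_x[i] = new_x[i] + phi(h[i], h[j], d2) * (x[i] - x[j])
    return new_x
```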
arXiv Detail & Related papers (2024-06-24T03:37:51Z) - Equivariant Machine Learning on Graphs with Nonlinear Spectral Filters [12.709930975472698]
We consider the graph functional shifts as the symmetry group: the unitary operators that commute with the graph shift operator.
We propose nonlinear spectral filters (NLSFs) that are fully equivariant to graph functional shifts.
We demonstrate the superior performance of NLSFs over existing spectral GNNs in node and graph classification benchmarks.
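The nonlinear part of NLSFs is beyond this summary, but the linear building block and the symmetry group are easy to state. A function h applied spectrally to the graph shift operator S commutes with every unitary that commutes with S, i.e. with all graph functional shifts; a sketch:

```python
import numpy as np

def spectral_filter(S, x, h):
    """Apply h(S) to a graph signal x, with S a symmetric shift operator
    (e.g. an adjacency matrix or Laplacian) and h a function on eigenvalues.

    Because h(S) = U @ diag(h(lam)) @ U.T is a function of S, it commutes
    with any unitary V satisfying V @ S == S @ V (the graph functional
    shifts), so the filter is equivariant to that symmetry group.
    """
    lam, U = np.linalg.eigh(S)
    return U @ (h(lam) * (U.T @ x))
```

For example, `spectral_filter(S, x, lambda lam: np.exp(-lam))` applies a heat-kernel-style low-pass filter.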
arXiv Detail & Related papers (2024-06-03T12:07:01Z) - Symmetry Breaking and Equivariant Neural Networks [17.740760773905986]
We introduce a novel notion of 'relaxed equivariance'.
We show how to incorporate this relaxation into equivariant multilayer perceptrons (E-MLPs).
The relevance of symmetry breaking is then discussed in various application domains.
arXiv Detail & Related papers (2023-12-14T15:06:48Z) - Geometric Graph Filters and Neural Networks: Limit Properties and
Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
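A common construction in this literature (the paper's exact kernel may differ) builds the graph from sampled points with a truncated Gaussian kernel; filters on the resulting weight matrix then approximate their manifold counterparts as the sample size grows and the bandwidth shrinks. A sketch:

```python
import numpy as np

def gaussian_graph(points, eps):
    """Weighted adjacency for points sampled from a manifold.

    points: (n, d) array of samples; eps: kernel bandwidth. Weights are
    exp(-|p_i - p_j|^2 / eps), truncated beyond squared distance eps.
    """
    D2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    W = np.exp(-D2 / eps) * (D2 < eps)
    np.fill_diagonal(W, 0.0)
    return W
```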
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Generative Adversarial Symmetry Discovery [19.098785309131458]
LieGAN represents symmetry as an interpretable Lie algebra basis and can discover various symmetries.
The learned symmetry can also be readily used in several existing equivariant neural networks to improve accuracy and generalization in prediction.
arXiv Detail & Related papers (2023-02-01T04:28:36Z) - Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
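One standard way to bring hypergraph signals into GNN territory (the paper's construction may differ) is a clique expansion: connect two nodes whenever they share a hyperedge. A sketch:

```python
import numpy as np

def clique_expansion(H):
    """Reduce a hypergraph to a graph so ordinary GNN machinery applies.

    H: (num_nodes, num_hyperedges) binary incidence matrix. Two nodes are
    adjacent in the expansion iff they co-occur in at least one hyperedge.
    """
    A = (H @ H.T > 0).astype(float)
    np.fill_diagonal(A, 0.0)
    return A
```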
arXiv Detail & Related papers (2022-11-11T23:44:20Z) - Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
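For Euclidean motions on point clouds, the FA literature builds frames from the centroid and PCA axes, with all eigenvector sign choices; averaging a backbone over the frame then yields invariance without averaging over the infinite group. A minimal sketch, assuming a non-degenerate covariance spectrum:

```python
import itertools
import numpy as np

def pca_frames(X):
    """Frames for an (n, d) point cloud: centroid plus PCA axes, with all
    2^d eigenvector sign choices (assumes distinct covariance eigenvalues)."""
    c = X.mean(axis=0)
    _, V = np.linalg.eigh((X - c).T @ (X - c))
    return [(c, V * np.array(s))
            for s in itertools.product((1.0, -1.0), repeat=X.shape[1])]

def frame_average(backbone, X):
    """Average an arbitrary backbone over the frame; the result is
    invariant to rotations, reflections, and translations of X."""
    return np.mean([backbone((X - c) @ R) for c, R in pca_frames(X)], axis=0)
```

The frame transforms along with the input (a rotated cloud yields correspondingly rotated frames), which is why averaging over only 2^d elements suffices instead of integrating over the whole group.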
arXiv Detail & Related papers (2021-10-07T11:05:23Z) - Permutation-equivariant and Proximity-aware Graph Neural Networks with
Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment existing GNNs with stochastic node representations.
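The paper's exact architecture is not reproduced here; the sketch below only illustrates the stochastic idea behind it. Starting each node from a random signature and propagating it a few hops makes inner products between rows reflect neighborhood overlap, a notion of proximity that anonymous message passing alone does not encode:

```python
import numpy as np

def stochastic_embeddings(A, dim=16, hops=2, seed=0):
    """Proximity-aware node embeddings from random signatures.

    A: (n, n) adjacency matrix. Each node starts from an i.i.d. Gaussian
    signature; after a few propagation hops, rows of E are similar for
    nodes with overlapping neighborhoods, encoding proximity.
    """
    rng = np.random.default_rng(seed)
    E = rng.normal(size=(A.shape[0], dim))
    for _ in range(hops):
        E = A @ E
    return E
```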
arXiv Detail & Related papers (2020-09-05T16:46:56Z) - Detecting Symmetries with Neural Networks [0.0]
We make extensive use of the structure in the embedding layer of the neural network.
We identify whether a symmetry is present and identify the orbits of the symmetry in the input.
For this example we present a novel data representation in terms of graphs.
arXiv Detail & Related papers (2020-03-30T17:58:24Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric
graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)