Integrating Symmetry into Differentiable Planning with Steerable
Convolutions
- URL: http://arxiv.org/abs/2206.03674v3
- Date: Mon, 1 May 2023 06:51:07 GMT
- Title: Integrating Symmetry into Differentiable Planning with Steerable
Convolutions
- Authors: Linfeng Zhao, Xupeng Zhu, Lingzhi Kong, Robin Walters, Lawson L.S.
Wong
- Abstract summary: Motivated by equivariant convolution networks, we treat the path planning problem as signals over grids.
We show that value iteration in this case is a linear equivariant operator, which is a (steerable) convolution.
Our implementation is based on VINs and uses steerable convolution networks to incorporate symmetry.
- Score: 5.916280909373456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study how group symmetry helps improve data efficiency and generalization
for end-to-end differentiable planning algorithms when symmetry appears in
decision-making tasks. Motivated by equivariant convolution networks, we treat
the path planning problem as \textit{signals} over grids. We show that value
iteration in this case is a linear equivariant operator, which is a (steerable)
convolution. This extends Value Iteration Networks (VINs), which use
convolutional networks for path planning, with additional rotation and
reflection symmetry. Our implementation is based on VINs and uses steerable
convolution networks to incorporate symmetry. The experiments are performed on
four tasks: 2D navigation, visual navigation, and 2-degrees-of-freedom (2-DOF)
configuration-space and workspace manipulation. Our symmetric planning
algorithms improve training efficiency and generalization by large margins
compared to non-equivariant counterparts, VIN and GPPN.
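The abstract's central observation, that value iteration over a grid is a convolution and that grid symmetry turns it into a (steerable) convolution, can be illustrated with a small sketch. The snippet below is not the authors' implementation (the paper builds on VIN with full steerable convolution layers); it only shows the simplest scalar-field case, in which a VIN-style Bellman-backup kernel is tied to be invariant under the eight rotations and reflections of the grid so that planning commutes with those transformations. The function names, kernel shapes, and number of abstract action channels are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def d4_symmetrize(kernel: torch.Tensor) -> torch.Tensor:
    """Average a conv kernel over the 8 symmetries of the square grid
    (4 rotations, each with an optional horizontal flip). Convolving a scalar
    field with the result commutes with those rotations/reflections."""
    views = []
    for r in range(4):
        rot = torch.rot90(kernel, r, dims=(-2, -1))
        views.append(rot)
        views.append(torch.flip(rot, dims=(-1,)))
    return torch.stack(views, dim=0).mean(dim=0)


def vin_value_iteration(reward: torch.Tensor, value: torch.Tensor,
                        q_kernel: torch.Tensor, n_steps: int = 10) -> torch.Tensor:
    """VIN-style value iteration: the Bellman backup is a convolution over the
    stacked reward/value maps, followed by a max over abstract action channels.

    reward, value: (B, 1, H, W) scalar fields over the grid.
    q_kernel:      (A, 2, k, k) raw weights, one output channel per action
                   (A and k are illustrative choices, not taken from the paper)."""
    w = d4_symmetrize(q_kernel)   # weight tying makes the backup D4-equivariant
    pad = w.shape[-1] // 2
    for _ in range(n_steps):
        q = F.conv2d(torch.cat([reward, value], dim=1), w, padding=pad)
        value = q.max(dim=1, keepdim=True).values
    return value


if __name__ == "__main__":
    reward = torch.zeros(1, 1, 16, 16)
    reward[0, 0, 3, 12] = 1.0                                  # hypothetical goal cell
    q_kernel = torch.nn.Parameter(0.1 * torch.randn(8, 2, 3, 3))  # learnable in a real VIN
    v = vin_value_iteration(reward, torch.zeros_like(reward), q_kernel)

    # Sanity check of the symmetry the paper exploits:
    # rotating the map and then planning equals planning and then rotating.
    v_rot = vin_value_iteration(torch.rot90(reward, 1, dims=(-2, -1)),
                                torch.zeros_like(reward), q_kernel)
    assert torch.allclose(torch.rot90(v, 1, dims=(-2, -1)), v_rot, atol=1e-5)
```

The explicit weight averaging here stands in for a proper steerable convolution; a full treatment, as in the paper, would use a steerable-CNN library and richer field types rather than only invariant scalar kernels.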
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - SBDet: A Symmetry-Breaking Object Detector via Relaxed Rotation-Equivariance [26.05910177212846]
Group Equivariant Convolution (GConv) empowers models to explore symmetries hidden in visual data, improving their performance.
Traditional GConv methods are limited by the strict operation rules in the group space, making it difficult to adapt to Symmetry-Breaking or non-rigid transformations.
We propose a novel Relaxed Rotation-Equivariant Network (R2Net) as the backbone and further develop the Symmetry-Breaking Object Detector (SBDet) for 2D object detection built upon it.
arXiv Detail & Related papers (2024-08-21T16:32:03Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Equivariant Ensembles and Regularization for Reinforcement Learning in Map-based Path Planning [5.69473229553916]
This paper proposes a method to construct equivariant policies and invariant value functions without specialized neural network components.
We show how equivariant ensembles and regularization benefit sample efficiency and performance; a generic sketch of this symmetrization idea appears after this list.
arXiv Detail & Related papers (2024-03-19T16:01:25Z) - Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance.
However, these symmetries provide fixed hard constraints on the functions a network can represent; they need to be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
arXiv Detail & Related papers (2023-10-09T20:22:43Z) - Can Euclidean Symmetry be Leveraged in Reinforcement Learning and
Planning? [5.943193860994729]
In robotic tasks, changes in reference frames typically do not influence the underlying physical properties of the system.
We put forth a theory that unifies prior work on discrete and continuous symmetry in reinforcement learning, planning, and optimal control.
arXiv Detail & Related papers (2023-07-17T04:01:48Z) - Deep Neural Networks with Efficient Guaranteed Invariances [77.99182201815763]
We address the problem of improving the performance and in particular the sample complexity of deep neural networks.
Group-equivariant convolutions are a popular approach to obtain equivariant representations.
We propose a multi-stream architecture, where each stream is invariant to a different transformation.
arXiv Detail & Related papers (2023-03-02T20:44:45Z) - Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z) - Improving the Sample-Complexity of Deep Classification Networks with
Invariant Integration [77.99182201815763]
Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks.
We propose a novel monomial selection algorithm based on pruning methods to allow an application to more complex problems.
We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2022-02-08T16:16:11Z) - Encoding Involutory Invariance in Neural Networks [1.6371837018687636]
In certain situations, Neural Networks (NN) are trained upon data that obey underlying physical symmetries.
In this work, we explore a special kind of symmetry where functions are invariant with respect to involutory linear/affine transformations up to parity.
Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry.
An adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry has also been proposed.
arXiv Detail & Related papers (2021-06-07T16:07:15Z)
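The entry above on equivariant ensembles for map-based path planning describes building equivariant policies and invariant value functions without specialized network components. As a generic sketch of that symmetrization idea, and not that paper's exact construction, the snippet below wraps an arbitrary value network so its output is invariant to the eight rotations and reflections of a map observation by averaging over transformed inputs; an equivariant policy would additionally undo the induced permutation of the action outputs. The network architecture and shapes here are hypothetical.

```python
import torch
import torch.nn as nn

# The eight elements of the dihedral group D4: (number of 90-degree rotations, flip?).
D4 = [(r, f) for r in range(4) for f in (False, True)]


def apply_g(x: torch.Tensor, r: int, f: bool) -> torch.Tensor:
    """Apply one D4 element to a (B, C, H, W) map observation."""
    x = torch.rot90(x, r, dims=(-2, -1))
    return torch.flip(x, dims=(-1,)) if f else x


def invariant_value(value_net: nn.Module, obs: torch.Tensor) -> torch.Tensor:
    """Make an arbitrary value network invariant to map rotations/reflections
    by averaging its predictions over every transformed copy of the input."""
    outs = [value_net(apply_g(obs, r, f)) for r, f in D4]
    return torch.stack(outs, dim=0).mean(dim=0)


# Hypothetical value network: any map-to-scalar model can be wrapped this way.
value_net = nn.Sequential(nn.Flatten(), nn.Linear(2 * 16 * 16, 64),
                          nn.ReLU(), nn.Linear(64, 1))
obs = torch.randn(4, 2, 16, 16)
v = invariant_value(value_net, obs)
# The wrapped network gives the same value for any rotated/reflected copy of the map.
v_rot = invariant_value(value_net, torch.rot90(obs, 1, dims=(-2, -1)))
assert torch.allclose(v, v_rot, atol=1e-5)
```

Because D4 is closed under composition, the set of transformed inputs is the same for the original and the rotated map, so the averaged prediction is identical up to floating-point summation order.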
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.