Can Euclidean Symmetry be Leveraged in Reinforcement Learning and
Planning?
- URL: http://arxiv.org/abs/2307.08226v1
- Date: Mon, 17 Jul 2023 04:01:48 GMT
- Title: Can Euclidean Symmetry be Leveraged in Reinforcement Learning and
Planning?
- Authors: Linfeng Zhao, Owen Howell, Jung Yeon Park, Xupeng Zhu, Robin Walters,
and Lawson L.S. Wong
- Abstract summary: In robotic tasks, changes in reference frames typically do not influence the underlying physical properties of the system.
We put forth a theory that unifies prior work on discrete and continuous symmetry in reinforcement learning, planning, and optimal control.
- Score: 5.943193860994729
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In robotic tasks, changes in reference frames typically do not influence the
underlying physical properties of the system, a property known as the
invariance of physical laws. These distance-preserving changes encompass
isometric transformations such as translations, rotations, and reflections,
collectively known as the Euclidean group. In this work, we delve into the
design of improved learning algorithms for reinforcement learning and planning
tasks that possess Euclidean group symmetry. We put forth a theory that
unifies prior work on discrete and continuous symmetry in reinforcement learning,
planning, and optimal control. On the algorithmic side, we extend value-based
2D path planning to continuous MDPs and propose a pipeline
for constructing equivariant sampling-based planning algorithms. Our work is
substantiated with empirical evidence and illustrated through examples that
explain the benefits of equivariance to Euclidean symmetry in tackling natural
control problems.
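As a concrete illustration of the invariance and equivariance properties discussed in the abstract, here is a minimal sketch (the goal-conditioned 2D task below is illustrative, not one of the paper's benchmarks): a value function that depends only on the distance to the goal is invariant under a joint rotation of state and goal, and the greedy policy is equivariant.

```python
import numpy as np

# Toy goal-conditioned task: state s and goal g are 2D positions.
# The value depends only on the distance to the goal, so it is invariant
# under any isometry applied jointly to s and g; the greedy policy
# (unit vector pointing at the goal) is equivariant.
def value(s, g):
    return -np.linalg.norm(s - g)

def policy(s, g):
    d = g - s
    return d / np.linalg.norm(d)

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

s = np.array([1.0, 2.0])
g = np.array([4.0, -1.0])

# Invariance of the value: V(Rs, Rg) == V(s, g)
assert np.isclose(value(R @ s, R @ g), value(s, g))
# Equivariance of the policy: pi(Rs, Rg) == R pi(s, g)
assert np.allclose(policy(R @ s, R @ g), R @ policy(s, g))
```

The same identities hold for translations and reflections, i.e. for the full Euclidean group.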
Related papers
- Geometric Understanding of Discriminability and Transferability for Visual Domain Adaptation [27.326817457760725]
Invariant representation learning for unsupervised domain adaptation (UDA) has made significant advances in computer vision and pattern recognition communities.
Recently, empirical connections between transferability and discriminability have received increasing attention.
In this work, we systematically analyze the essentials of transferability and discriminability from the geometric perspective.
arXiv Detail & Related papers (2024-06-24T13:31:08Z)
- Equivariant Ensembles and Regularization for Reinforcement Learning in Map-based Path Planning [5.69473229553916]
This paper proposes a method to construct equivariant policies and invariant value functions without specialized neural network components.
We show how equivariant ensembles and regularization benefit sample efficiency and performance.
arXiv Detail & Related papers (2024-03-19T16:01:25Z)
- Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance.
However, such symmetries provide fixed hard constraints on the functions a network can represent; they must be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
arXiv Detail & Related papers (2023-10-09T20:22:43Z)
- Symmetry Preservation in Hamiltonian Systems: Simulation and Learning [0.9208007322096532]
This work presents a general geometric framework for simulating and learning the dynamics of Hamiltonian systems.
We propose to simulate and learn the mappings of interest through the construction of $G$-invariant Lagrangian submanifolds.
Our designs leverage pivotal techniques and concepts in symplectic geometry and geometric mechanics.
arXiv Detail & Related papers (2023-08-30T21:34:33Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Generative Adversarial Symmetry Discovery [19.098785309131458]
LieGAN represents symmetry as an interpretable Lie algebra basis and can discover various symmetries.
The learned symmetry can also be readily used in several existing equivariant neural networks to improve accuracy and generalization in prediction.
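To make the Lie-algebra representation of symmetry concrete, here is a minimal sketch (a truncated power series for the matrix exponential, not LieGAN's actual training code): the single basis element of so(2) generates the planar rotation group.

```python
import numpy as np

# Lie-algebra view of planar rotation symmetry: the single basis element L
# of so(2) generates SO(2) through the matrix exponential exp(theta * L).
L = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def expm(A, terms=30):
    # Truncated power series exp(A) = sum_k A^k / k! (illustration only;
    # a library routine such as scipy.linalg.expm would normally be used).
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

theta = np.pi / 2
R = expm(theta * L)
# exp((pi/2) L) is the 90-degree rotation matrix.
assert np.allclose(R, [[0, -1], [1, 0]], atol=1e-6)
```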
arXiv Detail & Related papers (2023-02-01T04:28:36Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Neural Bregman Divergences for Distance Learning [60.375385370556145]
We propose a new approach to learning arbitrary Bregman divergences in a differentiable manner via input convex neural networks.
We show that our method more faithfully learns divergences over a set of both new and previously studied tasks.
Our tests further extend to known asymmetric, but non-Bregman tasks, where our method still performs competitively despite misspecification.
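For readers unfamiliar with Bregman divergences, a minimal sketch of the definition (the generator F and points below are illustrative; the paper learns F with input convex neural networks): D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>, which recovers squared Euclidean distance when F(x) = ||x||^2.

```python
import numpy as np

def bregman(F, grad_F, x, y):
    # D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>
    return F(x) - F(y) - grad_F(y) @ (x - y)

# With F(x) = ||x||^2, the Bregman divergence is squared Euclidean distance.
F = lambda v: v @ v
grad_F = lambda v: 2 * v

x = np.array([1.0, 3.0])
y = np.array([-2.0, 0.5])
assert np.isclose(bregman(F, grad_F, x, y), np.sum((x - y) ** 2))
```

Other choices of convex F yield asymmetric divergences such as KL, which is what makes the family expressive for distance learning.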
arXiv Detail & Related papers (2022-06-09T20:53:15Z)
- Integrating Symmetry into Differentiable Planning with Steerable Convolutions [5.916280909373456]
Motivated by equivariant convolution networks, we treat the path planning problem as signals over grids.
We show that value iteration in this case is a linear equivariant operator, which is a (steerable) convolution.
Our implementation is based on VINs and uses steerable convolution networks to incorporate symmetry.
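The claim that value iteration on a grid is an equivariant local operator can be sketched as follows; the toy reward map and the periodic boundary handling below are illustrative assumptions, not the paper's VIN-based implementation.

```python
import numpy as np

def value_iteration(reward, gamma=0.9, iters=100):
    # One Bellman backup: V <- reward + gamma * max over 4-neighbour shifts.
    # The shift-and-max update is a local stencil, hence translation-
    # equivariant (exactly so with the periodic boundaries of np.roll).
    V = np.zeros_like(reward)
    for _ in range(iters):
        shifted = np.stack([np.roll(V, s, axis=a)
                            for a in (0, 1) for s in (1, -1)])
        V = reward + gamma * shifted.max(axis=0)
    return V

reward = np.full((5, 5), -0.01)
reward[0, 0] = 1.0  # goal cell

V = value_iteration(reward)
# Equivariance: shifting the reward map shifts the value map identically.
V_shifted = value_iteration(np.roll(reward, (2, 1), axis=(0, 1)))
assert np.allclose(V_shifted, np.roll(V, (2, 1), axis=(0, 1)))
```

The steerable-convolution construction in the paper extends this observation from translations to rotations and reflections.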
arXiv Detail & Related papers (2022-06-08T04:58:48Z)
- Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents [102.42623636238399]
We identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making.
We derive algorithms that exploit these geometric structures to solve these problems efficiently.
arXiv Detail & Related papers (2022-03-20T16:23:17Z)
- Meta-Learning Symmetries by Reparameterization [63.85144439337671]
We present a method for learning and encoding equivariances into networks by learning corresponding parameter sharing patterns from data.
Our experiments suggest that it can automatically learn to encode equivariances to common transformations used in image processing tasks.
arXiv Detail & Related papers (2020-07-06T17:59:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.