Parallelised Diffeomorphic Sampling-based Motion Planning
- URL: http://arxiv.org/abs/2108.11775v1
- Date: Thu, 26 Aug 2021 13:15:11 GMT
- Title: Parallelised Diffeomorphic Sampling-based Motion Planning
- Authors: Tin Lai, Weiming Zhi, Tucker Hermans and Fabio Ramos
- Abstract summary: We propose Parallelised Diffeomorphic Sampling-based Motion Planning (PDMP).
PDMP transforms the sampling distributions of sampling-based motion planners, in a manner akin to normalising flows.
PDMP leverages gradient information of costs to inject specifications, in a manner similar to optimisation-based motion planning methods.
- Score: 30.310891362316863
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose Parallelised Diffeomorphic Sampling-based Motion Planning (PDMP).
PDMP is a novel parallelised framework that uses bijective and differentiable
mappings, or diffeomorphisms, to transform sampling distributions of
sampling-based motion planners, in a manner akin to normalising flows. Unlike
normalising flow models which use invertible neural network structures to
represent these diffeomorphisms, we develop them from gradient information of
desired costs, and encode desirable behaviour, such as obstacle avoidance.
These transformed sampling distributions can then be used for sampling-based
motion planning. A particular example is when we wish to imbue the sampling
distribution with knowledge of the environment geometry, such that drawn
samples are less likely to be in collision. To this end, we propose to learn a
continuous occupancy representation from environment occupancy data, such that
the gradient of the representation defines a valid diffeomorphism and is amenable
to fast parallel evaluation. We use this to "morph" the sampling distribution
to draw far fewer collision-prone samples. PDMP leverages gradient
information of costs to inject specifications, in a manner similar to
optimisation-based motion planning methods, but it relies on drawing from a
sampling distribution and so retains the tendency to find more global solutions,
thereby bridging the gap between trajectory optimisation and sampling-based
planning methods.
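The core idea above — composing small gradient steps on a smooth occupancy field to "morph" a sampling distribution away from obstacles — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian-bump occupancy field, the step size, the step count, and the 0.5 occupancy threshold are all assumptions chosen for clarity.

```python
import numpy as np

CENTRE = np.array([0.5, 0.5])  # assumed obstacle location
WIDTH = 0.15                   # assumed obstacle scale

def occupancy(x):
    # Smooth (differentiable) occupancy: a Gaussian bump around an obstacle.
    d = x - CENTRE
    return np.exp(-np.sum(d * d, axis=-1) / (2 * WIDTH ** 2))

def occupancy_grad(x):
    # Analytic gradient of the Gaussian occupancy with respect to x.
    d = x - CENTRE
    return -(d / WIDTH ** 2) * occupancy(x)[..., None]

def morph(samples, step=0.02, n_steps=25):
    # Compose small gradient-descent steps on occupancy. Each step is a
    # smooth map (invertible for small enough steps), and their composition
    # pushes probability mass out of occupied space. All samples are updated
    # in one vectorised call per step, mirroring the parallel evaluation.
    x = samples.copy()
    for _ in range(n_steps):
        x = x - step * occupancy_grad(x)
    return x

rng = np.random.default_rng(0)
raw = rng.uniform(0.0, 1.0, size=(2000, 2))  # untransformed sampling distribution
morphed = morph(raw)

# Fraction of samples in the high-occupancy (collision-prone) region.
frac_raw = np.mean(occupancy(raw) > 0.5)
frac_morphed = np.mean(occupancy(morphed) > 0.5)
print(frac_raw, frac_morphed)
```

After morphing, the fraction of samples in the high-occupancy region drops sharply, which is the effect the abstract describes: the transformed distribution can then be handed to any sampling-based planner.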
Related papers
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Sampling for Model Predictive Trajectory Planning in Autonomous Driving using Normalizing Flows [1.2972104025246092]
This paper investigates several sampling approaches for trajectory generation.
Normalizing flows, originating from the field of variational inference, are considered.
Learning-based normalizing flow models are trained for a more efficient exploration of the input domain.
arXiv Detail & Related papers (2024-04-15T10:45:12Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Arbitrary Distributions Mapping via SyMOT-Flow: A Flow-based Approach Integrating Maximum Mean Discrepancy and Optimal Transport [2.7309692684728617]
We introduce a novel model called SyMOT-Flow that trains an invertible transformation by minimizing the symmetric maximum mean discrepancy between samples from two unknown distributions.
The resulting transformation leads to more stable and accurate sample generation.
arXiv Detail & Related papers (2023-08-26T08:39:16Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Example-Based Sampling with Diffusion Models [7.943023838493658]
Diffusion models for image generation could be appropriate for learning how to generate point sets from examples.
We propose a generic way to produce 2-d point sets imitating existing samplers from observed point sets using a diffusion model.
We demonstrate how the differentiability of our approach can be used to optimize point sets to enforce properties.
arXiv Detail & Related papers (2023-02-10T08:35:17Z)
- From Points to Functions: Infinite-dimensional Representations in Diffusion Models [23.916417852496608]
Diffusion-based generative models learn to iteratively transfer unstructured noise to a complex target distribution.
We show that a combination of information content from different time steps gives a strictly better representation for the downstream task.
arXiv Detail & Related papers (2022-10-25T05:30:53Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.