Approximating Constraint Manifolds Using Generative Models for
Sampling-Based Constrained Motion Planning
- URL: http://arxiv.org/abs/2204.06791v1
- Date: Thu, 14 Apr 2022 07:08:30 GMT
- Title: Approximating Constraint Manifolds Using Generative Models for
Sampling-Based Constrained Motion Planning
- Authors: Cihan Acar, Keng Peng Tee
- Abstract summary: This paper presents a learning-based sampling strategy for constrained motion planning problems.
We use the Conditional Variational Autoencoder (CVAE) and the Conditional Generative Adversarial Net (CGAN) to generate constraint-satisfying sample configurations.
We evaluate the efficiency of these two generative models in terms of their sampling accuracy and coverage of sampling distribution.
- Score: 8.924344714683814
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sampling-based motion planning under task constraints is challenging because
the null-measure constraint manifold in the configuration space makes rejection
sampling extremely inefficient, if not impossible. This paper presents a
learning-based sampling strategy for constrained motion planning problems. We
investigate the use of two well-known deep generative models, the Conditional
Variational Autoencoder (CVAE) and the Conditional Generative Adversarial Net
(CGAN), to generate constraint-satisfying sample configurations. Instead of
precomputed graphs, we use generative models conditioned on constraint
parameters for approximating the constraint manifold. This approach allows for
the efficient drawing of constraint-satisfying samples online without any need
for modification of available sampling-based motion planning algorithms. We
evaluate the efficiency of these two generative models in terms of their
sampling accuracy and coverage of sampling distribution. Simulations and
experiments are also conducted for different constraint tasks on two robotic
platforms.
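The core idea above — replace rejection sampling on a null-measure manifold with a learned conditional generator queried online — can be sketched in miniature. The snippet below is an illustrative toy, not the paper's implementation: the "generative model" is a hand-written stand-in that maps a latent variable and a constraint parameter r to a point near the circle {q : |q| = r} (a null-measure set in a 2D configuration space), and near-misses from the imperfect generator are projected back onto the manifold. All function names are assumptions for illustration.

```python
import math
import random

# Toy stand-in for a trained conditional generative model (e.g. a CVAE
# decoder): maps a latent sample z and a constraint parameter r to a
# configuration near the constraint manifold {q : |q| = r}.
def generative_sample(r, noise=5e-3):
    z = random.uniform(0.0, 2.0 * math.pi)  # latent variable
    eps = random.gauss(0.0, noise)          # model approximation error
    return ((r + eps) * math.cos(z), (r + eps) * math.sin(z))

def constraint_error(q, r):
    # distance of configuration q from the manifold {q : |q| = r}
    return abs(math.hypot(q[0], q[1]) - r)

def draw_constrained_samples(n, r, tol=1e-2):
    """Online sampling loop a sampling-based planner could call in place
    of uniform configuration sampling; samples outside the tolerance are
    projected (here: radially) back onto the manifold."""
    samples = []
    while len(samples) < n:
        q = generative_sample(r)
        if constraint_error(q, r) > tol:
            norm = math.hypot(q[0], q[1])
            q = (q[0] * r / norm, q[1] * r / norm)
        samples.append(q)
    return samples
```

Because the generator already concentrates its mass near the manifold, nearly every draw is usable, whereas uniform rejection sampling against a measure-zero set would never terminate without a tolerance band.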
Related papers
- Plug-and-Play Controllable Generation for Discrete Masked Models [27.416952690340903]
This article makes discrete masked models for generative modeling of discrete data controllable.
We propose a novel plug-and-play framework based on importance sampling that bypasses the need for training a conditional score.
Our framework is agnostic to the choice of control criteria, requires no gradient information, and is well-suited for tasks such as posterior sampling, Bayesian inverse problems, and constrained generation.
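The gradient-free, criterion-agnostic control described in this summary can be sketched with self-normalized importance sampling: draw proposals from an unconditional base sampler, weight them by a user-supplied criterion, and resample. The base model and criterion below are toy stand-ins, not the paper's models, and all names are illustrative.

```python
import random

# Plug-and-play control via importance sampling: no conditional score
# is trained and no gradient of the criterion is needed.
def base_model_sample():
    # stand-in for an unconditional discrete generative model
    return [random.randint(0, 9) for _ in range(4)]

def control_weight(seq):
    # user-supplied control criterion (here: favor even-sum sequences)
    return 1.0 if sum(seq) % 2 == 0 else 0.0

def controlled_samples(n, num_proposals=200):
    pool = [base_model_sample() for _ in range(num_proposals)]
    weights = [control_weight(s) for s in pool]
    if sum(weights) == 0.0:
        raise ValueError("no proposal satisfied the control criterion")
    # random.choices normalizes relative weights internally
    return random.choices(pool, weights=weights, k=n)
```

Swapping in a different `control_weight` changes the target distribution without touching the base sampler, which is what makes the scheme plug-and-play.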
arXiv Detail & Related papers (2024-10-03T02:00:40Z)
- Gradient and Uncertainty Enhanced Sequential Sampling for Global Fit [0.0]
This paper proposes a new sampling strategy for global fit called Gradient and Uncertainty Enhanced Sequential Sampling (GUESS).
We show that GUESS achieved on average the highest sample efficiency compared to other surrogate-based strategies on the tested examples.
arXiv Detail & Related papers (2023-09-29T19:49:39Z)
- Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling [7.357511266926065]
Conditional sampling of variational autoencoders (VAEs) is needed in various applications, such as missing data imputation, but is computationally intractable.
A principled choice for asymptotically exact conditional sampling is Metropolis-within-Gibbs (MWG).
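Metropolis-within-Gibbs for conditional sampling can be illustrated on a toy joint density: clamp the observed coordinate and update the missing one with a random-walk Metropolis step. The correlated bivariate Gaussian below is an illustrative stand-in for a VAE's model density (an assumption for this sketch, not the paper's setup).

```python
import math
import random

RHO = 0.8  # correlation of the toy joint density

def log_joint(x, y):
    # log-density of a standard bivariate normal with correlation RHO,
    # up to an additive constant
    return -(x * x - 2.0 * RHO * x * y + y * y) / (2.0 * (1.0 - RHO * RHO))

def mwg_conditional(y_obs, n_steps=5000, step=0.5):
    """Sample the missing coordinate x given observed y_obs by
    Metropolis-within-Gibbs: y stays clamped, x takes random-walk
    Metropolis steps targeting p(x | y_obs)."""
    x, chain = 0.0, []
    for _ in range(n_steps):
        proposal = x + random.gauss(0.0, step)
        # accept with probability min(1, p(proposal, y) / p(x, y))
        if math.log(random.random()) < log_joint(proposal, y_obs) - log_joint(x, y_obs):
            x = proposal
        chain.append(x)
    return chain
```

For this toy joint the exact conditional is N(RHO * y, 1 - RHO**2), so after burn-in the chain mean should approach 0.8 * y_obs, which gives a simple correctness check.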
arXiv Detail & Related papers (2023-08-17T16:08:18Z)
- DiMSam: Diffusion Models as Samplers for Task and Motion Planning under Partial Observability [58.75803543245372]
Task and Motion Planning (TAMP) approaches are suited for planning multi-step autonomous robot manipulation.
We propose to overcome the limitations of such approaches under partial observability by composing diffusion models within a TAMP system.
We show how the combination of classical TAMP, generative modeling, and latent embedding enables multi-step constraint-based reasoning.
arXiv Detail & Related papers (2023-06-22T20:40:24Z)
- Protein Design with Guided Discrete Diffusion [67.06148688398677]
A popular approach to protein design is to combine a generative model with a discriminative model for conditional sampling.
We propose diffusioN Optimized Sampling (NOS), a guidance method for discrete diffusion models.
NOS makes it possible to perform design directly in sequence space, circumventing significant limitations of structure-based methods.
arXiv Detail & Related papers (2023-05-31T16:31:24Z)
- Conditional Denoising Diffusion for Sequential Recommendation [62.127862728308045]
Two prominent generative models, Generative Adversarial Networks (GANs) and Variational AutoEncoders (VAEs), each have known drawbacks: GANs suffer from unstable optimization, while VAEs are prone to posterior collapse and over-smoothed generations.
We present a conditional denoising diffusion model, which includes a sequence encoder, a cross-attentive denoising decoder, and a step-wise diffuser.
arXiv Detail & Related papers (2023-04-22T15:32:59Z)
- A Data-Driven State Aggregation Approach for Dynamic Discrete Choice Models [7.7347261505610865]
We present a novel algorithm that provides a data-driven method for selecting and aggregating states.
The proposed two-stage approach mitigates the curse of dimensionality by reducing the problem dimension.
We demonstrate the empirical performance of the algorithm in two classic dynamic discrete choice estimation applications.
arXiv Detail & Related papers (2023-04-11T01:07:24Z)
- Toward Certified Robustness Against Real-World Distribution Shifts [65.66374339500025]
We train a generative model to learn perturbations from data and define specifications with respect to the output of the learned model.
A unique challenge arising from this setting is that existing verifiers cannot tightly approximate sigmoid activations.
We propose a general meta-algorithm for handling sigmoid activations which leverages classical notions of counter-example-guided abstraction refinement.
arXiv Detail & Related papers (2022-06-08T04:09:13Z)
- Calibrating Over-Parametrized Simulation Models: A Framework via Eligibility Set [3.862247454265944]
We develop a framework for constructing calibration schemes that satisfy rigorous frequentist statistical guarantees.
We demonstrate our methodology on several numerical examples, including an application to calibration of a limit order book market simulator.
arXiv Detail & Related papers (2021-05-27T00:59:29Z)
- Robust Optimal Transport with Applications in Generative Modeling and Domain Adaptation [120.69747175899421]
Optimal Transport (OT) distances such as Wasserstein have been used in several areas such as GANs and domain adaptation.
We propose a computationally-efficient dual form of the robust OT optimization that is amenable to modern deep learning applications.
Our approach can train state-of-the-art GAN models on noisy datasets corrupted with outlier distributions.
arXiv Detail & Related papers (2020-10-12T17:13:40Z)
- Control as Hybrid Inference [62.997667081978825]
We present an implementation of CHI which naturally mediates the balance between iterative and amortised inference.
We verify the scalability of our algorithm on a continuous control benchmark, demonstrating that it outperforms strong model-free and model-based baselines.
arXiv Detail & Related papers (2020-07-11T19:44:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.