Discrete-Continuous Smoothing and Mapping
- URL: http://arxiv.org/abs/2204.11936v1
- Date: Mon, 25 Apr 2022 19:31:44 GMT
- Title: Discrete-Continuous Smoothing and Mapping
- Authors: Kevin J. Doherty and Ziqi Lu and Kurran Singh and John J. Leonard
- Abstract summary: We describe a general approach to smoothing and mapping with a class of discrete-continuous factor graphs commonly encountered in robotics applications.
We provide a library, DC-SAM, extending existing tools for optimization problems defined in terms of factor graphs to the setting of discrete-continuous models.
- Score: 8.90077503980675
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We describe a general approach to smoothing and mapping with a class of
discrete-continuous factor graphs commonly encountered in robotics
applications. While there are openly available tools providing flexible and
easy-to-use interfaces for specifying and solving optimization problems
formulated in terms of either discrete or continuous graphical models, at
present, no similarly general tools exist enabling the same functionality for
hybrid discrete-continuous problems. We aim to address this problem. In
particular, we provide a library, DC-SAM, extending existing tools for
optimization problems defined in terms of factor graphs to the setting of
discrete-continuous models. A key contribution of our work is a novel solver
for efficiently recovering approximate solutions to discrete-continuous
optimization problems. The key insight to our approach is that while joint
inference over continuous and discrete state spaces is often hard, many
commonly encountered discrete-continuous problems can naturally be split into a
"discrete part" and a "continuous part" that can individually be solved easily.
Leveraging this structure, we optimize discrete and continuous variables in an
alternating fashion. In consequence, our proposed work enables straightforward
representation of and approximate inference in discrete-continuous graphical
models. We also provide a method to recover the uncertainty in estimates of
both discrete and continuous variables. We demonstrate the versatility of our
approach through its application to three distinct robot perception
applications: point-cloud registration, robust pose graph optimization, and
object-based mapping and localization.
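The alternating scheme lends itself to a compact illustration. Below is a minimal sketch of the idea on a toy robust-estimation problem (the model, variable names, and noise parameters are assumptions for illustration; this is not the DC-SAM API): each scalar measurement either observes the continuous state x (inlier) or is explained by a broad outlier model, and we alternate a closed-form continuous step with a per-factor discrete step.

```python
import numpy as np

# Toy discrete-continuous problem in the spirit of the abstract above: each
# measurement z_i is either an inlier (d_i = 1) of the scalar state x, or an
# outlier (d_i = 0) drawn from a broad null model. Alternating minimization:
# hold d fixed and solve for x in closed form, then hold x fixed and pick
# each d_i by per-factor argmin of the negative log-likelihood.

def alternating_dc_solve(z, sigma_in=0.1, sigma_out=10.0, iters=20):
    d = np.ones(len(z), dtype=int)        # initial discrete guess: all inliers
    x = np.median(z)                      # robust initial continuous guess
    for _ in range(iters):
        # Continuous step: weighted least squares given the discrete labels.
        w = np.where(d == 1, 1.0 / sigma_in**2, 1.0 / sigma_out**2)
        x = np.sum(w * z) / np.sum(w)
        # Discrete step: factors are independent given x, so argmin per factor.
        nll_in = 0.5 * ((z - x) / sigma_in) ** 2 + np.log(sigma_in)
        nll_out = 0.5 * ((z - x) / sigma_out) ** 2 + np.log(sigma_out)
        d_new = (nll_in < nll_out).astype(int)
        if np.array_equal(d_new, d):
            break                         # reached a fixed point
        d = d_new
    return x, d

z = np.concatenate([np.random.normal(1.0, 0.1, 50), np.array([8.0, -5.0])])
x_hat, d_hat = alternating_dc_solve(z)
print(x_hat, d_hat.sum(), "inliers")
```

Because each step exactly minimizes the joint objective over its own block of variables, the objective is non-increasing and the alternation terminates at a fixed point.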
Related papers
- Randomized algorithms and PAC bounds for inverse reinforcement learning in continuous spaces [47.907236421762626]
This work studies discrete-time discounted Markov decision processes with continuous state and action spaces.
We first consider the case in which we have access to the entire expert policy and characterize the set of solutions to the inverse problem.
arXiv Detail & Related papers (2024-05-24T12:53:07Z)
- It's All in the Mix: Wasserstein Machine Learning with Mixed Features [5.739657897440173]
We present a practically efficient algorithm to solve mixed-feature problems.
We demonstrate that our approach can significantly outperform existing methods that ignore the presence of discrete features.
arXiv Detail & Related papers (2023-12-19T15:15:52Z)
- Variational Annealing on Graphs for Combinatorial Optimization [7.378582040635655]
We show that an autoregressive approach which captures statistical dependencies among solution variables yields superior performance on many popular CO problems.
We introduce subgraph tokenization in which the configuration of a set of solution variables is represented by a single token.
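As a rough illustration of the tokenization idea (helper names are hypothetical; this is not the paper's implementation), grouping k binary solution variables into one categorical token with 2**k states looks like:

```python
import numpy as np

# Illustrative subgraph tokenization: pack each group of k binary variables
# into a single integer token, so an autoregressive model emits one token per
# subgraph instead of k separate bits.

def tokenize(bits, k):
    """Pack each consecutive group of k bits into one integer token."""
    bits = np.asarray(bits).reshape(-1, k)
    weights = 1 << np.arange(k)               # [1, 2, 4, ...]
    return bits @ weights

def detokenize(tokens, k):
    """Unpack tokens back into the original bit configuration."""
    tokens = np.asarray(tokens)[:, None]
    return ((tokens >> np.arange(k)) & 1).reshape(-1)

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])     # 8 variables, subgraphs of size 4
tokens = tokenize(bits, k=4)                  # 2 tokens in {0, ..., 15}
assert np.array_equal(detokenize(tokens, k=4), bits)
```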
arXiv Detail & Related papers (2023-11-23T18:56:51Z)
- Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
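As a flavor of such an alternation, here is a minimal sketch on a toy convex-submodular minimax instance (the objective, which is modular and hence submodular in the discrete variable, plus the step size and greedy oracle are assumptions for illustration, not the paper's algorithms):

```python
import numpy as np

# Toy convex-submodular minimax problem:
#   min_x max_{S : |S| <= k}  0.5 * ||x - c||^2 + sum_{i in S} x_i,
# convex in x and (modular, hence) submodular in S. Discrete step: greedy
# maximization, which for a modular objective is exactly top-k. Continuous
# step: subgradient descent on x.

def greedy_topk(gains, k):
    """Greedy maximizer; exact top-k for a modular objective."""
    return set(np.argsort(gains)[-k:])

def minimax_alternate(c, k, steps=200, lr=0.05):
    x = np.zeros_like(c)
    for _ in range(steps):
        S = greedy_topk(x, k)            # discrete ascent step
        grad = x - c                     # gradient of the smooth part
        grad[list(S)] += 1.0             # subgradient from sum_{i in S} x_i
        x -= lr * grad                   # continuous descent step
    S = greedy_topk(x, k)                # final discrete response
    return x, S

c = np.array([3.0, -1.0, 2.0, 0.5])
x_star, S_star = minimax_alternate(c, k=2)
print(x_star, S_star)
```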
arXiv Detail & Related papers (2021-11-01T21:06:35Z)
- Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace matrix.
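In that spirit, a generic sparsity-promoting orthogonal iteration (soft-thresholding plus QR retraction, a standard pattern, not the paper's exact algorithm; the threshold and matrix sizes are assumptions) might look like:

```python
import numpy as np

# Generic sparsity-promoting subspace iteration: alternate a power step toward
# the dominant eigenspace of a symmetric matrix W with soft-thresholding, then
# retract back onto the Stiefel manifold (U^T U = I) via a QR factorization.

def sparse_dominant_eigenspace(W, p, thresh=0.05, iters=100, seed=0):
    n = W.shape[0]
    U = np.linalg.qr(np.random.default_rng(seed).normal(size=(n, p)))[0]
    for _ in range(iters):
        V = W @ U                                           # power step
        V = np.sign(V) * np.maximum(np.abs(V) - thresh, 0)  # promote sparsity
        U = np.linalg.qr(V)[0]                              # retract to Stiefel
    return U

A = np.random.default_rng(1).normal(size=(20, 20))
W = A @ A.T                                                 # symmetric PSD test matrix
U = sparse_dominant_eigenspace(W, p=3)
print(np.linalg.norm(U.T @ U - np.eye(3)))                  # ~0: orthonormal columns
```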
arXiv Detail & Related papers (2021-09-30T19:17:35Z)
- Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms is not only simple and efficient but also applicable to a broad range of contexts.
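For flavor, here is a minimal dissipative Hamiltonian optimizer, a conformal symplectic Euler integrator in which friction drains energy until the trajectory settles at a minimizer (the test function, step size, and friction constant are assumptions; this is far simpler than the paper's constrained setting):

```python
import numpy as np

# Conformal symplectic Euler for H(q, p) = f(q) + 0.5 * ||p||^2 with friction
# gamma: momentum is dissipated each step, so the flow converges to a
# minimizer of f instead of oscillating forever.

def conformal_symplectic_descent(grad_f, q0, h=0.1, gamma=1.0, steps=500):
    q = np.asarray(q0, dtype=float).copy()
    p = np.zeros_like(q)
    for _ in range(steps):
        p = np.exp(-gamma * h) * p - h * grad_f(q)   # dissipate, then kick
        q = q + h * p                                # drift
    return q

# Minimize f(q) = 0.5 * q^T A q - b^T q, whose minimum is A^{-1} b.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
q_min = conformal_symplectic_descent(lambda q: A @ q - b, q0=[0.0, 0.0])
print(q_min, np.linalg.solve(A, b))                  # should roughly agree
```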
arXiv Detail & Related papers (2021-07-23T13:43:34Z)
- Joint Continuous and Discrete Model Selection via Submodularity [1.332560004325655]
In model selection problems for machine learning, the desire for a well-performing model with meaningful structure is typically expressed through a regularized optimization problem.
In many scenarios, however, the meaningful structure is specified in some discrete space, leading to difficult non-convex optimization problems.
We show how simple continuous or discrete constraints can also be handled for certain problem classes, motivated by robust optimization.
arXiv Detail & Related papers (2021-02-17T21:14:47Z)
- Differentiable Causal Discovery from Interventional Data [141.41931444927184]
We propose a theoretically-grounded method based on neural networks that can leverage interventional data.
We show that our approach compares favorably to the state of the art in a variety of settings.
arXiv Detail & Related papers (2020-07-03T15:19:17Z)
- Consistent Second-Order Conic Integer Programming for Learning Bayesian Networks [2.7473982588529653]
We study the problem of learning the sparse DAG structure of a BN from continuous observational data.
The optimal solution to this mathematical program is known to have desirable statistical properties under certain conditions.
We propose a concrete early stopping criterion to terminate the branch-and-bound process in order to obtain a near-optimal solution.
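To illustrate gap-based early stopping in general (a toy 0/1 knapsack, not the paper's conic integer program; the instance and tolerance are assumptions), a best-first branch-and-bound loop can terminate once the best remaining bound certifies the incumbent to within a relative tolerance:

```python
import heapq

# Best-first branch-and-bound with a gap-based early stop: once the largest
# remaining upper bound is within a factor (1 + tol) of the incumbent, the
# incumbent is certified near-optimal and the search terminates.

def knapsack_bnb(values, weights, cap, tol=0.01):
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(idx, val, room):
        """Fractional (LP-relaxation) upper bound over items order[idx:]."""
        for i in order[idx:]:
            if weights[i] <= room:
                room -= weights[i]
                val += values[i]
            else:
                return val + values[i] * room / weights[i]
        return val

    best_val = 0.0
    heap = [(-bound(0, 0.0, cap), 0, 0.0, cap)]
    while heap:
        neg_ub, idx, val, room = heapq.heappop(heap)
        if -neg_ub <= best_val * (1 + tol):       # early stop: gap below tol
            break
        if idx == n:
            continue
        i = order[idx]
        if weights[i] <= room:                    # branch: take item i
            take = val + values[i]
            best_val = max(best_val, take)        # feasible, so update incumbent
            heapq.heappush(heap, (-bound(idx + 1, take, room - weights[i]),
                                  idx + 1, take, room - weights[i]))
        heapq.heappush(heap, (-bound(idx + 1, val, room),  # branch: skip item i
                              idx + 1, val, room))
    return best_val

print(knapsack_bnb([10, 13, 7, 8], [4, 6, 3, 5], cap=10, tol=0.01))  # 23.0
```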
arXiv Detail & Related papers (2020-05-29T00:13:15Z)
- Learning with Differentiable Perturbed Optimizers [54.351317101356614]
We propose a systematic method to transform optimizers into operations that are differentiable and never locally constant.
Our approach relies on stochastically perturbed optimizers, and can be used readily together with existing solvers.
We show how this framework can be connected to a family of losses developed in structured prediction, and give theoretical guarantees for their use in learning tasks.
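A minimal sketch of the perturbation idea (illustrative only; with Gumbel noise the perturbed argmax has a closed form, which makes the smoothing easy to verify):

```python
import numpy as np

# Perturbed optimizer sketch: replace the piecewise-constant map
# y(theta) = onehot(argmax theta) with its expectation under random
# perturbations, F(theta) = E[onehot(argmax(theta + eps * Z))], which is
# smooth in theta. With Gumbel noise and eps = 1 this expectation is exactly
# softmax(theta), giving an easy correctness check.

def perturbed_argmax(theta, eps=1.0, n_samples=10_000, rng=None):
    rng = rng or np.random.default_rng(0)
    Z = rng.gumbel(size=(n_samples, theta.size))     # Gumbel perturbations
    idx = np.argmax(theta + eps * Z, axis=1)         # solve each perturbed problem
    onehots = np.eye(theta.size)[idx]
    return onehots.mean(axis=0)                      # smooth Monte Carlo surrogate

theta = np.array([2.0, 1.0, -0.5])
print(perturbed_argmax(theta, eps=1.0))              # Monte Carlo estimate
print(np.exp(theta) / np.exp(theta).sum())           # matches softmax(theta)
```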
arXiv Detail & Related papers (2020-02-20T11:11:32Z)