Safe Perception-Based Control under Stochastic Sensor Uncertainty using
Conformal Prediction
- URL: http://arxiv.org/abs/2304.00194v2
- Date: Fri, 25 Aug 2023 18:58:06 GMT
- Title: Safe Perception-Based Control under Stochastic Sensor Uncertainty using
Conformal Prediction
- Authors: Shuo Yang, George J. Pappas, Rahul Mangharam, and Lars Lindemann
- Abstract summary: We propose a perception-based control framework that quantifies estimation uncertainty of perception maps.
We also integrate these uncertainty representations into the control design.
We demonstrate the effectiveness of our proposed perception-based controller for a LiDAR-enabled F1/10th car.
- Score: 27.515056747751053
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We consider perception-based control using state estimates that are obtained
from high-dimensional sensor measurements via learning-enabled perception maps.
However, these perception maps are not perfect and result in state estimation
errors that can lead to unsafe system behavior. Stochastic sensor noise can
make matters worse and result in estimation errors that follow unknown
distributions. We propose a perception-based control framework that i)
quantifies estimation uncertainty of perception maps, and ii) integrates these
uncertainty representations into the control design. To do so, we use conformal
prediction to compute valid state estimation regions, which are sets that
contain the unknown state with high probability. We then devise a sampled-data
controller for continuous-time systems based on the notion of measurement-robust
control barrier functions. Our controller uses ideas from self-triggered control,
which allows us to avoid stochastic calculus. Our framework is
agnostic to the choice of the perception map, independent of the noise
distribution, and to the best of our knowledge the first to provide
probabilistic safety guarantees in such a setting. We demonstrate the
effectiveness of our proposed perception-based controller for a LiDAR-enabled
F1/10th car.
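The conformal prediction step can be illustrated concretely. The sketch below uses split conformal prediction with a Euclidean nonconformity score to compute a state estimation region radius; the synthetic data, function names, and score choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def conformal_region_radius(x_cal, xhat_cal, delta=0.05):
    """Split conformal prediction for state estimation regions.

    Given calibration pairs of true states and perception-based estimates,
    returns a radius C such that a fresh (exchangeable) estimation error
    satisfies P(||x - xhat|| <= C) >= 1 - delta.
    """
    # Nonconformity score: Euclidean estimation error on calibration data.
    scores = np.linalg.norm(x_cal - xhat_cal, axis=1)
    n = len(scores)
    # Finite-sample conformal quantile level: ceil((n+1)(1-delta))/n.
    q = np.ceil((n + 1) * (1 - delta)) / n
    if q > 1.0:
        raise ValueError("calibration set too small for this delta")
    return np.quantile(scores, q, method="higher")

# Illustrative usage on synthetic calibration data.
rng = np.random.default_rng(0)
x_cal = rng.uniform(-1.0, 1.0, size=(500, 3))            # true states
xhat_cal = x_cal + 0.1 * rng.standard_normal((500, 3))   # perception estimates
C = conformal_region_radius(x_cal, xhat_cal, delta=0.05)
# {x : ||x - xhat|| <= C} then contains the true state with prob. >= 0.95.
print(f"region radius C = {C:.3f}")
```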
Related papers
- Automatic AI controller that can drive with confidence: steering vehicle with uncertainty knowledge [3.131134048419781]
This research develops a vehicle lateral control system within a machine learning framework.
We employ a Bayesian Neural Network (BNN), a probabilistic learning model, to quantify uncertainty.
By establishing a confidence threshold, we can trigger manual intervention, ensuring that the algorithm relinquishes control whenever it operates outside safe parameters.
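The confidence-gating logic can be sketched with any approximate Bayesian predictor; the ensemble stand-in, threshold value, and function names below are illustrative assumptions rather than the paper's exact BNN setup.

```python
import numpy as np

def gated_steering(models, observation, std_threshold=0.15):
    """Return (steering_command, autonomous_flag).

    `models` is a list of predictors approximating a Bayesian posterior
    (e.g. ensemble members or MC-dropout passes); their disagreement is
    the confidence signal. The threshold 0.15 is an arbitrary placeholder.
    """
    preds = np.array([m(observation) for m in models])
    mean, std = preds.mean(), preds.std()
    if std > std_threshold:
        return None, False    # low confidence: request manual intervention
    return float(mean), True  # confident: keep autonomous control
```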
arXiv Detail & Related papers (2024-04-24T23:22:37Z)
- Robust Control for Dynamical Systems With Non-Gaussian Noise via Formal Abstractions [59.605246463200736]
We present a novel controller synthesis method that does not rely on any explicit representation of the noise distributions.
First, we abstract the continuous control system into a finite-state model that captures noise by probabilistic transitions between discrete states.
We use state-of-the-art verification techniques to provide guarantees on the interval Markov decision process and compute a controller for which these guarantees carry over to the original control system.
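Guarantees on an interval MDP are commonly computed by robust value iteration, in which an adversary picks the worst transition probabilities inside the intervals. The sketch below targets maximum reachability probability; the data structures are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def worst_case_expectation(values, lo, hi):
    """Adversarial expectation over distributions p with lo <= p <= hi and
    sum(p) = 1: allocates the free mass to low-value successors first."""
    p = lo.astype(float).copy()
    budget = 1.0 - p.sum()
    for i in np.argsort(values):                 # low-value states first
        extra = min(hi[i] - lo[i], budget)
        p[i] += extra
        budget -= extra
    return float(p @ values)

def robust_value_iteration(transitions, goal, n_states, iters=200):
    """transitions[s] = list of (lo, hi) interval arrays, one per action.
    Returns a robust lower bound on the max probability of reaching `goal`."""
    V = np.zeros(n_states)
    V[list(goal)] = 1.0
    for _ in range(iters):
        for s in range(n_states):
            if s in goal:
                continue
            V[s] = max(worst_case_expectation(V, lo, hi)
                       for lo, hi in transitions[s])
    return V
```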
arXiv Detail & Related papers (2023-01-04T10:40:30Z)
- Safe Output Feedback Motion Planning from Images via Learned Perception Modules and Contraction Theory [6.950510860295866]
We present a motion planning approach for a class of uncertain control-affine nonlinear systems that guarantees runtime safety and goal reachability.
We train a perception system that seeks to invert a subset of the state from an observation, and estimate an upper bound on the perception error.
Next, we use contraction theory to design a stabilizing state feedback controller and a convergent dynamic state observer.
We derive a bound on the trajectory tracking error when this controller is subjected to errors in the dynamics and incorrect state estimates.
arXiv Detail & Related papers (2022-06-14T02:03:27Z)
- Trajectory Forecasting from Detection with Uncertainty-Aware Motion Encoding [121.66374635092097]
Trajectories obtained from object detection and tracking are inevitably noisy.
We propose a trajectory predictor directly based on detection results without relying on explicitly formed trajectories.
arXiv Detail & Related papers (2022-02-03T09:09:56Z)
- Learning Robust Output Control Barrier Functions from Safe Expert Demonstrations [50.37808220291108]
This paper addresses learning safe output feedback control laws from partial observations of expert demonstrations.
We first propose robust output control barrier functions (ROCBFs) as a means to guarantee safety.
We then formulate an optimization problem to learn ROCBFs from expert demonstrations that exhibit safe system behavior.
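Such a learning problem is often posed as empirical risk minimization with hinge losses encoding the barrier conditions on demonstration data; the surrogate below is a generic sketch of that idea, not the paper's exact formulation (all arguments are assumed callables or arrays).

```python
def rocbf_hinge_loss(h, grad_h, safe_states, unsafe_states, demo_pairs,
                     dynamics, alpha=1.0, margin=0.1):
    """Hinge-loss surrogate for learning a control barrier function h.

    Encourages h >= margin on safe demonstration states, h <= -margin on
    unsafe states, and hdot + alpha*h >= 0 along demonstrated (state,
    control) pairs.
    """
    loss = 0.0
    for x in safe_states:
        loss += max(0.0, margin - h(x))    # safe set: h(x) >= margin
    for x in unsafe_states:
        loss += max(0.0, margin + h(x))    # unsafe set: h(x) <= -margin
    for x, u in demo_pairs:
        hdot = grad_h(x) @ dynamics(x, u)  # Lie derivative along the demo
        loss += max(0.0, margin - (hdot + alpha * h(x)))
    return loss
```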
arXiv Detail & Related papers (2021-11-18T23:21:00Z)
- Sampling-Based Robust Control of Autonomous Systems with Non-Gaussian Noise [59.47042225257565]
We present a novel planning method that does not rely on any explicit representation of the noise distributions.
First, we abstract the continuous system into a discrete-state model that captures noise by probabilistic transitions between states.
We then compute sample-based bounds on the transition probabilities and capture these bounds in the transition probability intervals of a so-called interval Markov decision process (iMDP).
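Such probability intervals can be derived from sampled noise realizations with a standard concentration inequality. The sketch below uses Hoeffding's inequality as a stand-in for the paper's sample-based bounds:

```python
import numpy as np

def transition_interval(n_hits, n_samples, confidence=0.99):
    """PAC interval for a transition probability from Monte Carlo samples.

    Uses Hoeffding's inequality (a stand-in for the paper's sample-based
    bounds): with probability >= confidence, the true transition
    probability lies in the returned interval.
    """
    p_hat = n_hits / n_samples
    eps = np.sqrt(np.log(2.0 / (1.0 - confidence)) / (2.0 * n_samples))
    return max(0.0, p_hat - eps), min(1.0, p_hat + eps)

# e.g. 812 of 1000 sampled noise realizations land in the successor region:
print(transition_interval(812, 1000))  # -> roughly (0.76, 0.86)
```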
arXiv Detail & Related papers (2021-10-25T06:18:55Z)
- CertainNet: Sampling-free Uncertainty Estimation for Object Detection [65.28989536741658]
Estimating the uncertainty of a neural network plays a fundamental role in safety-critical settings.
In this work, we propose a novel sampling-free uncertainty estimation method for object detection.
We call it CertainNet, and it is the first to provide separate uncertainties for each output signal: objectness, class, location and size.
arXiv Detail & Related papers (2021-10-04T17:59:31Z)
- Learning Uncertainty For Safety-Oriented Semantic Segmentation In Autonomous Driving [77.39239190539871]
We show how uncertainty estimation can be leveraged to enable safety critical image segmentation in autonomous driving.
We introduce a new uncertainty measure based on disagreeing predictions as measured by a dissimilarity function.
We show experimentally that our proposed approach is much less computationally intensive at inference time than competing methods.
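A per-pixel disagreement measure of this kind can be computed from a handful of predictions; the mean pairwise label disagreement below is a simple stand-in for the paper's dissimilarity function.

```python
import numpy as np
from itertools import combinations

def disagreement_uncertainty(label_maps):
    """Per-pixel uncertainty from disagreeing segmentation predictions.

    `label_maps` has shape (M, H, W): M predicted class maps, e.g. from
    ensemble members or augmented inputs. Returns an (H, W) map giving
    the fraction of prediction pairs that disagree at each pixel.
    """
    maps = np.asarray(label_maps)
    pairs = list(combinations(range(maps.shape[0]), 2))
    disagree = sum((maps[i] != maps[j]).astype(float) for i, j in pairs)
    return disagree / len(pairs)
```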
arXiv Detail & Related papers (2021-05-28T09:23:05Z)
- Guaranteeing Safety of Learned Perception Modules via Measurement-Robust Control Barrier Functions [43.4346415363429]
We seek to unify techniques from control theory and machine learning to synthesize controllers that achieve safety.
We define a Measurement-Robust Control Barrier Function (MR-CBF) as a tool for determining safe control inputs.
We demonstrate the efficacy of MR-CBFs in achieving safety with measurement model uncertainty on a simulated Segway system.
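An MR-CBF filter minimally modifies a nominal input subject to a barrier constraint tightened by the measurement-error bound. With a single affine constraint the underlying QP has a closed form (a half-space projection), which the sketch below uses; the tightening term is schematic, not the paper's exact condition.

```python
import numpy as np

def mr_cbf_filter(u_nom, Lf_h, Lg_h, h_val, eps, lip_const, alpha=1.0):
    """Measurement-robust CBF safety filter (single affine constraint).

    Enforces Lf_h + Lg_h @ u + alpha*h_val - lip_const*eps >= 0, where eps
    is a state-estimation error bound (e.g. a conformal region radius) and
    lip_const is a Lipschitz-style tightening constant. Schematic only;
    assumes Lg_h is nonzero.
    """
    a = np.asarray(Lg_h, dtype=float)            # constraint normal in u
    b = lip_const * eps - Lf_h - alpha * h_val   # constraint: a @ u >= b
    if a @ u_nom >= b:
        return u_nom                             # nominal input already safe
    # Minimum-norm correction: project u_nom onto the half-space a @ u >= b.
    return u_nom + (b - a @ u_nom) * a / (a @ a)
```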
arXiv Detail & Related papers (2020-10-30T00:19:01Z)
- Robust Learning-Based Control via Bootstrapped Multiplicative Noise [0.0]
We propose a robust adaptive control algorithm that explicitly incorporates non-asymptotic, data-driven uncertainty estimates into the control design.
A key advantage of the proposed approach is that the system identification and robust control design procedures both use the same uncertainty representation.
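The bootstrap step can be sketched as refitting a least-squares dynamics model on resampled data and reading uncertainty levels off the spread of the estimates; this is a generic illustration, not the paper's exact procedure.

```python
import numpy as np

def bootstrap_dynamics_uncertainty(X, U, X_next, n_boot=200, seed=0):
    """Bootstrap uncertainty for a linear model x+ = A x + B u.

    Refits least squares on resampled transitions and returns the
    elementwise standard deviation of the stacked estimate [A B],
    usable as a multiplicative-noise-style uncertainty level.
    """
    rng = np.random.default_rng(seed)
    Z = np.hstack([X, U])                            # regressors, shape (N, n+m)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(Z), size=len(Z))   # resample with replacement
        theta, *_ = np.linalg.lstsq(Z[idx], X_next[idx], rcond=None)
        estimates.append(theta.T)                    # theta.T = [A B]
    return np.std(estimates, axis=0)
```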
arXiv Detail & Related papers (2020-02-24T04:12:52Z)