Safe Adaptive Cruise Control Under Perception Uncertainty: A Deep Ensemble and Conformal Tube Model Predictive Control Approach
- URL: http://arxiv.org/abs/2412.03792v1
- Date: Thu, 05 Dec 2024 01:01:53 GMT
- Title: Safe Adaptive Cruise Control Under Perception Uncertainty: A Deep Ensemble and Conformal Tube Model Predictive Control Approach
- Authors: Xiao Li, Anouck Girard, Ilya Kolmanovsky
- Abstract summary: This paper considers a Deep Ensemble of Deep Neural Network regressors integrated with Conformal Prediction to predict and quantify uncertainties. An adaptive cruise controller using Conformal Tube Model Predictive Control is designed to ensure probabilistic safety. Evaluations with a high-fidelity simulator demonstrate the algorithm's effectiveness in speed tracking and maintaining a safe following distance.
- Score: 5.740554452832947
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autonomous driving heavily relies on perception systems to interpret the environment for decision-making. To enhance robustness in these safety-critical applications, this paper considers a Deep Ensemble of Deep Neural Network regressors integrated with Conformal Prediction to predict and quantify uncertainties. In the Adaptive Cruise Control setting, the proposed method performs state and uncertainty estimation from RGB images, informing the downstream controller of the DNN perception uncertainties. An adaptive cruise controller using Conformal Tube Model Predictive Control is designed to ensure probabilistic safety. Evaluations with a high-fidelity simulator demonstrate the algorithm's effectiveness in speed tracking and maintaining a safe following distance, including in Out-Of-Distribution scenarios. A minimal sketch of the ensemble-plus-conformal uncertainty pipeline appears below.
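The paper's implementation is not included in this listing; the following is a minimal sketch of the general technique the abstract describes, assuming the standard split-conformal recipe: average the Deep Ensemble's predictions, calibrate an interval radius on held-out residuals, and tighten the safe-distance constraint by that radius before it reaches the controller. All names (`ensemble_predict`, `d_safe`, the member callables) are illustrative placeholders, not the authors' API.

```python
import numpy as np

def ensemble_predict(members, x):
    """Deep Ensemble point estimate: mean over the member regressors.
    Each member is any callable mapping an input (e.g. image features)
    to a scalar distance estimate."""
    return float(np.mean([m(x) for m in members]))

def conformal_radius(members, X_cal, y_cal, alpha=0.1):
    """Split conformal prediction: the ceil((n+1)(1-alpha))-th smallest
    calibration residual. Under exchangeability, the true value lies
    within this radius of the ensemble mean with probability >= 1-alpha."""
    residuals = np.sort(
        [abs(y - ensemble_predict(members, x)) for x, y in zip(X_cal, y_cal)]
    )
    n = len(residuals)
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    return residuals[k - 1]

def tightened_distance_ok(d_hat, q, d_safe):
    """Tube-style constraint tightening: require the worst case of the
    conformal interval [d_hat - q, d_hat + q] to clear the safety margin,
    so the controller plans against d_hat - q rather than d_hat."""
    return d_hat - q >= d_safe
```

In a Conformal Tube MPC, the same idea is applied along the prediction horizon: each predicted headway constraint is shifted inward by the calibrated radius, which is what yields the probabilistic safety guarantee.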
Related papers
- Designing Control Barrier Function via Probabilistic Enumeration for Safe Reinforcement Learning Navigation [55.02966123945644]
We propose a hierarchical control framework leveraging neural network verification techniques to design control barrier functions (CBFs) and policy correction mechanisms.
Our approach relies on probabilistic enumeration to identify unsafe regions of operation, which are then used to construct a safe CBF-based control layer.
These experiments demonstrate the ability of the proposed solution to correct unsafe actions while preserving efficient navigation behavior.
arXiv Detail & Related papers (2025-04-30T13:47:25Z)
- Optimal Parameter Adaptation for Safety-Critical Control via Safe Barrier Bayesian Optimization [27.36423499121502]
The control barrier function (CBF) method, a promising solution for safety-critical control, poses the new challenge of enhancing control performance.
We propose a novel framework combining the CBF method with Bayesian optimization (BO) to optimize the safe control performance.
arXiv Detail & Related papers (2025-03-25T04:56:17Z)
- Autonomous Driving With Perception Uncertainties: Deep-Ensemble Based Adaptive Cruise Control [6.492311803411367]
Advanced perception systems utilizing black-box Deep Neural Networks (DNNs) demonstrate human-like comprehension.
However, their unpredictable behavior and lack of interpretability may hinder deployment in safety-critical scenarios.
arXiv Detail & Related papers (2024-03-22T19:04:58Z)
- Learning the Uncertainty Sets for Control Dynamics via Set Membership: A Non-Asymptotic Analysis [18.110158316883403]
This paper focuses on set membership estimation (SME) for unknown linear systems.
We provide the first convergence rate bounds for SME and discuss variations of SME under relaxed assumptions.
We also provide numerical results demonstrating SME's practical promise; a minimal sketch of the core membership test appears after this entry.
arXiv Detail & Related papers (2023-09-26T03:58:06Z)
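The paper's setting and proofs are more general, but the core object is simple to state. As a rough illustration, assuming the standard model x_{t+1} = A x_t + B u_t + w_t with a known bound ||w_t||_inf <= w_bar, the uncertainty set contains every (A, B) whose one-step residuals respect that bound; the test below is a hypothetical sketch, not the authors' code.

```python
import numpy as np

def in_membership_set(A, B, xs, us, w_bar):
    """Set membership test for x_{t+1} = A x_t + B u_t + w_t with
    ||w_t||_inf <= w_bar: a candidate (A, B) remains in the uncertainty
    set iff every observed one-step residual stays within the bound."""
    for x, u, x_next in zip(xs[:-1], us, xs[1:]):
        residual = x_next - A @ x - B @ u
        if np.max(np.abs(residual)) > w_bar:
            return False
    return True
```

Each new transition can only shrink the set of candidates that pass this test; the paper's contribution is bounding how fast that shrinkage happens.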
- Safe Perception-Based Control under Stochastic Sensor Uncertainty using Conformal Prediction [27.515056747751053]
We propose a perception-based control framework that quantifies estimation uncertainty of perception maps.
We also integrate these uncertainty representations into the control design.
We demonstrate the effectiveness of our proposed perception-based controller for a LiDAR-enabled F1/10th car.
arXiv Detail & Related papers (2023-04-01T01:45:53Z)
- Meta-Learning Priors for Safe Bayesian Optimization [72.8349503901712]
We build on a meta-learning algorithm, F-PACOH, capable of providing reliable uncertainty quantification in settings of data scarcity.
As our core contribution, we develop a novel framework for choosing safety-compliant priors in a data-driven manner.
On benchmark functions and a high-precision motion system, we demonstrate that our meta-learned priors accelerate the convergence of safe BO approaches; a minimal safe-BO acquisition sketch appears after this entry.
arXiv Detail & Related papers (2022-10-03T08:38:38Z)
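The meta-learning part is specific to F-PACOH, but the safe-BO step those priors feed into is generic. The sketch below uses scikit-learn's Gaussian process as a stand-in prior and a SafeOpt-style rule: only candidates whose pessimistic estimate clears the safety threshold are eligible, and the optimistic best among them is queried next. The names (`f_min`, `beta`) and the RBF kernel are assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def safe_bo_step(X_obs, y_obs, candidates, f_min, beta=2.0):
    """One safe Bayesian optimization step: fit a GP, restrict to
    candidates that are safe under the pessimistic bound mu - beta*sigma,
    then pick the one with the largest optimistic value mu + beta*sigma."""
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-4)
    gp.fit(X_obs, y_obs)
    mu, sigma = gp.predict(candidates, return_std=True)
    safe = mu - beta * sigma >= f_min          # pessimistic safety check
    if not np.any(safe):
        return None                            # no certifiably safe query
    ucb = np.where(safe, mu + beta * sigma, -np.inf)
    return candidates[int(np.argmax(ucb))]
```

A meta-learned prior tightens sigma where related tasks are informative, which is why it accelerates convergence: more candidates are certified safe earlier.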
- Recursively Feasible Probabilistic Safe Online Learning with Control Barrier Functions [60.26921219698514]
We introduce a model-uncertainty-aware reformulation of CBF-based safety-critical controllers.
We then present the pointwise feasibility conditions of the resulting safety controller.
We use these conditions to devise an event-triggered online data collection strategy.
arXiv Detail & Related papers (2022-08-23T05:02:09Z)
- Learning Robust Output Control Barrier Functions from Safe Expert Demonstrations [50.37808220291108]
This paper addresses learning safe output feedback control laws from partial observations of expert demonstrations.
We first propose robust output control barrier functions (ROCBFs) as a means to guarantee safety.
We then formulate an optimization problem to learn ROCBFs from expert demonstrations that exhibit safe system behavior.
arXiv Detail & Related papers (2021-11-18T23:21:00Z)
- Adaptive control of a mechatronic system using constrained residual reinforcement learning [0.0]
We propose a simple, practical and intuitive approach to improve the performance of a conventional controller in uncertain environments.
Our approach is motivated by the observation that conventional controllers in industrial motion control value robustness over adaptivity to deal with different operating conditions.
arXiv Detail & Related papers (2021-10-06T08:13:05Z)
- Pointwise Feasibility of Gaussian Process-based Safety-Critical Control under Model Uncertainty [77.18483084440182]
Control Barrier Functions (CBFs) and Control Lyapunov Functions (CLFs) are popular tools for enforcing safety and stability of a controlled system, respectively.
We present a Gaussian Process (GP)-based approach to tackle the problem of model uncertainty in safety-critical controllers that use CBFs and CLFs.
arXiv Detail & Related papers (2021-06-13T23:08:49Z)
- Learning Control Barrier Functions from Expert Demonstrations [69.23675822701357]
We propose a learning-based approach to safe controller synthesis based on control barrier functions (CBFs).
We analyze an optimization-based approach to learning a CBF that enjoys provable safety guarantees under suitable Lipschitz assumptions on the underlying dynamical system.
To the best of our knowledge, these are the first results that learn provably safe control barrier functions from data; a minimal sketch of a CBF safety filter appears after this entry.
arXiv Detail & Related papers (2020-04-07T12:29:06Z)
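Once a CBF h is learned, it is typically deployed as a min-norm safety filter. For a control-affine system x' = f(x) + g(x)u with scalar input, the filter solves min (u - u_nom)^2 subject to L_f h + L_g h * u + alpha * h >= 0, which admits the closed form below; this is the generic filtering step, not the paper's learning procedure.

```python
def cbf_safety_filter(u_nom, h, Lfh, Lgh, alpha=1.0, eps=1e-9):
    """Min-norm CBF filter (scalar input): return the control closest to
    u_nom that satisfies Lfh + Lgh*u + alpha*h >= 0; this is the
    closed-form solution of the usual CBF quadratic program."""
    slack = Lfh + Lgh * u_nom + alpha * h
    if slack >= 0.0 or abs(Lgh) < eps:
        # Constraint already met, or the input cannot affect h at this
        # state (Lgh ~ 0): pass the nominal control through unchanged.
        return u_nom
    return u_nom - slack / Lgh   # project onto the constraint boundary
```

The paper's guarantees concern learning an h for which this kind of filter is provably safe under Lipschitz assumptions on the dynamics.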
- Robust Learning-Based Control via Bootstrapped Multiplicative Noise [0.0]
We propose a robust adaptive control algorithm that explicitly incorporates such non-asymptotic uncertainties into the control design.
A key advantage of the proposed approach is that the system identification and robust control design procedures both use uncertainty representations; see the bootstrap sketch after this entry.
arXiv Detail & Related papers (2020-02-24T04:12:52Z)
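The paper's multiplicative-noise machinery is more specific, but its bootstrap ingredient is easy to illustrate: resample observed transitions, refit least squares each time, and treat the spread of estimates as a non-asymptotic uncertainty description. A hypothetical scalar-system sketch, not the authors' code:

```python
import numpy as np

def bootstrap_model_estimates(xs, us, n_boot=200, seed=0):
    """Bootstrap uncertainty for x_{t+1} = a*x_t + b*u_t + w_t: refit
    least squares on resampled transitions; the spread of the returned
    (a_hat, b_hat) rows quantifies identification uncertainty."""
    rng = np.random.default_rng(seed)
    xs = np.asarray(xs)
    X = np.column_stack([xs[:-1], us])   # regressors [x_t, u_t]
    y = xs[1:]                           # one-step-ahead targets
    n = len(y)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)             # resample with replacement
        theta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        estimates.append(theta)
    return np.array(estimates)           # shape (n_boot, 2)
```

In the paper, a spread like this is fed back into the control synthesis as multiplicative noise, so the controller hedges against exactly the uncertainty the data exhibits.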