Fault Tolerant Neural Control Barrier Functions for Robotic Systems under Sensor Faults and Attacks
- URL: http://arxiv.org/abs/2402.18677v1
- Date: Wed, 28 Feb 2024 19:44:19 GMT
- Title: Fault Tolerant Neural Control Barrier Functions for Robotic Systems under Sensor Faults and Attacks
- Authors: Hongchao Zhang, Luyao Niu, Andrew Clark, Radha Poovendran
- Abstract summary: We study safety-critical control synthesis for robotic systems under sensor faults and attacks.
Our main contribution is the development and synthesis of a new class of CBFs that we term fault tolerant neural control barrier functions (FT-NCBFs).
- Score: 6.314000948709254
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Safety is a fundamental requirement of many robotic systems. Control barrier
function (CBF)-based approaches have been proposed to guarantee the safety of
robotic systems. However, the effectiveness of these approaches highly relies
on the choice of CBFs. Inspired by the universal approximation power of neural
networks, there is a growing trend toward representing CBFs using neural
networks, leading to the notion of neural CBFs (NCBFs). Current NCBFs, however,
are trained and deployed in benign environments, making them ineffective for
scenarios where robotic systems experience sensor faults and attacks. In this
paper, we study safety-critical control synthesis for robotic systems under
sensor faults and attacks. Our main contribution is the development and
synthesis of a new class of CBFs that we term fault tolerant neural control
barrier function (FT-NCBF). We derive the necessary and sufficient conditions
for FT-NCBFs to guarantee safety, and develop a data-driven method to learn
FT-NCBFs by minimizing a loss function constructed using the derived
conditions. Using the learned FT-NCBF, we synthesize a control input and
formally prove the safety guarantee provided by our approach. We demonstrate
our proposed approach using two case studies: obstacle avoidance problem for an
autonomous mobile robot and spacecraft rendezvous problem, with code available
via https://github.com/HongchaoZhang-HZ/FTNCBF.
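To make the control-synthesis step concrete, the sketch below shows the standard CBF quadratic-program safety filter that approaches like this build on: minimally modify a nominal input so that the barrier condition Lf h + Lg h u + alpha * h >= 0 holds. This is a minimal illustration assuming a control-affine system and a single constraint (solved in closed form); `cbf_qp_filter` is an illustrative name, not code from the paper's repository, which additionally handles sensor faults via the learned FT-NCBF.

```python
import numpy as np

def cbf_qp_filter(u_nom, h, grad_h, f, g, alpha=1.0):
    """Minimally modify u_nom so that the CBF condition
    Lf h + Lg h u + alpha * h >= 0 holds, for the control-affine
    dynamics x_dot = f(x) + g(x) u. With a single affine
    constraint, the QP  min ||u - u_nom||^2  has a closed-form
    solution: project u_nom onto the safe half-space."""
    Lfh = grad_h @ f              # Lie derivative of h along f
    Lgh = grad_h @ g              # Lie derivative of h along g (vector)
    b = Lfh + alpha * h
    viol = Lgh @ u_nom + b        # constraint value at the nominal input
    if viol >= 0:
        return u_nom              # nominal input is already safe
    # Project onto the half-space {u : Lgh @ u + b >= 0}.
    return u_nom - (viol / (Lgh @ Lgh)) * Lgh
```

For example, for a single integrator (f = 0, g = I) with h(x) = 1 - ||x||^2 and a nominal input pushing toward the boundary of the safe set, the filter returns the closest input that keeps the barrier condition satisfied.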
Related papers
- Verification of Neural Control Barrier Functions with Symbolic Derivative Bounds Propagation [6.987300771372427]
We propose a new efficient verification framework for ReLU-based neural CBFs.
We show that the symbolic bounds can be propagated through the inner product of neural CBF Jacobian and nonlinear system dynamics.
arXiv Detail & Related papers (2024-10-04T21:42:25Z)
- Learning Local Control Barrier Functions for Safety Control of Hybrid Systems [11.57209279619218]
Safety is a primary concern for hybrid robotic systems.
Existing safety-critical control approaches for hybrid systems are either computationally inefficient, detrimental to system performance, or limited to small-scale systems.
We propose a learning-enabled approach to construct local Control Barrier Functions (CBFs) to guarantee the safety of a wide class of nonlinear hybrid dynamical systems.
arXiv Detail & Related papers (2024-01-26T14:38:43Z)
- Safe Neural Control for Non-Affine Control Systems with Differentiable Control Barrier Functions [58.19198103790931]
This paper addresses the problem of safety-critical control for non-affine control systems.
It has been shown that optimizing quadratic costs subject to state and control constraints can be sub-optimally reduced to a sequence of quadratic programs (QPs) by using Control Barrier Functions (CBFs).
We incorporate higher-order CBFs into neural ordinary differential equation-based learning models as differentiable CBFs to guarantee safety for non-affine control systems.
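As background for the higher-order CBF condition mentioned in this summary, a standard textbook formulation (not specific to this paper) for a barrier function h of relative degree r defines a chain of functions and enforces the final condition on the control input:

```
\begin{aligned}
\psi_0(x) &= h(x),\\
\psi_i(x) &= \dot{\psi}_{i-1}(x) + \alpha_i\bigl(\psi_{i-1}(x)\bigr),
  \qquad i = 1, \dots, r-1,\\
\psi_r(x, u) &= \dot{\psi}_{r-1}(x, u) + \alpha_r\bigl(\psi_{r-1}(x)\bigr) \ge 0,
\end{aligned}
```

where each alpha_i is a class-K function; the safe set is the intersection of the zero-superlevel sets of psi_0 through psi_(r-1).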
arXiv Detail & Related papers (2023-09-06T05:35:48Z)
- Safe Control Under Input Limits with Neural Control Barrier Functions [3.5270468102327004]
We propose new methods to synthesize control barrier function (CBF)-based safe controllers that avoid input saturation.
We leverage techniques from machine learning, like neural networks and deep learning, to simplify this challenging problem in nonlinear control design.
We provide empirical results on a 10D state, 4D input quadcopter-pendulum system.
arXiv Detail & Related papers (2022-11-20T19:01:37Z)
- Recursively Feasible Probabilistic Safe Online Learning with Control Barrier Functions [60.26921219698514]
We introduce a model-uncertainty-aware reformulation of CBF-based safety-critical controllers.
We then present the pointwise feasibility conditions of the resulting safety controller.
We use these conditions to devise an event-triggered online data collection strategy.
arXiv Detail & Related papers (2022-08-23T05:02:09Z)
- BarrierNet: A Safety-Guaranteed Layer for Neural Networks [50.86816322277293]
BarrierNet allows the safety constraints of a neural controller to adapt to changing environments.
We evaluate it on a series of control problems such as traffic merging and robot navigation in 2D and 3D space.
arXiv Detail & Related papers (2021-11-22T15:38:11Z)
- SABER: Data-Driven Motion Planner for Autonomously Navigating Heterogeneous Robots [112.2491765424719]
We present an end-to-end online motion planning framework that uses a data-driven approach to navigate a heterogeneous robot team towards a global goal.
We use stochastic model predictive control (SMPC) to calculate control inputs that satisfy robot dynamics, and consider uncertainty during obstacle avoidance with chance constraints.
Recurrent neural networks are used to provide a quick estimate of future state uncertainty considered in the SMPC finite-time horizon solution.
A Deep Q-learning agent is employed to serve as a high-level path planner, providing the SMPC with target positions that move the robots towards a desired global goal.
arXiv Detail & Related papers (2021-08-03T02:56:21Z)
- Pointwise Feasibility of Gaussian Process-based Safety-Critical Control under Model Uncertainty [77.18483084440182]
Control Barrier Functions (CBFs) and Control Lyapunov Functions (CLFs) are popular tools for enforcing safety and stability of a controlled system, respectively.
We present a Gaussian Process (GP)-based approach to tackle the problem of model uncertainty in safety-critical controllers that use CBFs and CLFs.
arXiv Detail & Related papers (2021-06-13T23:08:49Z)
- Learning Control Barrier Functions from Expert Demonstrations [69.23675822701357]
We propose a learning-based approach to safe controller synthesis based on control barrier functions (CBFs).
We analyze an optimization-based approach to learning a CBF that enjoys provable safety guarantees under suitable Lipschitz assumptions on the underlying dynamical system.
To the best of our knowledge, these are the first results that learn provably safe control barrier functions from data.
arXiv Detail & Related papers (2020-04-07T12:29:06Z)
- Training Neural Network Controllers Using Control Barrier Functions in the Presence of Disturbances [9.21721532941863]
We propose to use imitation learning to learn Neural Network-based feedback controllers.
We also develop a new class of High Order CBF for systems under external disturbances.
arXiv Detail & Related papers (2020-01-18T18:43:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.