Closed-loop Analysis of Vision-based Autonomous Systems: A Case Study
- URL: http://arxiv.org/abs/2302.04634v1
- Date: Mon, 6 Feb 2023 18:56:20 GMT
- Title: Closed-loop Analysis of Vision-based Autonomous Systems: A Case Study
- Authors: Corina S. Pasareanu, Ravi Mangal, Divya Gopinath, Sinem Getir Yaman,
Calum Imrie, Radu Calinescu, and Huafeng Yu
- Abstract summary: We present a case study applying formal probabilistic analysis techniques to an experimental autonomous system.
We show how to leverage local, DNN-specific analyses as run-time guards to increase the safety of the overall system.
- Score: 16.776221250574075
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks (DNNs) are increasingly used in safety-critical
autonomous systems as perception components processing high-dimensional image
data. Formal analysis of these systems is particularly challenging due to the
complexity of the perception DNNs, the sensors (cameras), and the environment
conditions. We present a case study applying formal probabilistic analysis
techniques to an experimental autonomous system that guides airplanes on
taxiways using a perception DNN. We address the above challenges by replacing
the camera and the network with a compact probabilistic abstraction built from
the confusion matrices computed for the DNN on a representative image data set.
We also show how to leverage local, DNN-specific analyses as run-time guards to
increase the safety of the overall system. Our findings are applicable to other
autonomous systems that use complex DNNs for perception.
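To make the abstraction concrete, below is a minimal sketch under stated assumptions: the aircraft's cross-track position is discretized into five hypothetical bins, the confusion-matrix entries are illustrative rather than measured, and a toy plant and controller close the loop. The paper analyzes such a model formally; this stand-in estimates the safety probability by Monte Carlo simulation instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretization of the aircraft's cross-track position.
STATES = ["far_left", "left", "center", "right", "far_right"]

# Illustrative confusion matrix: C[i][j] = P(DNN outputs bin j | true bin i),
# as it would be estimated from a representative labeled image set.
C = np.array([
    [0.90, 0.08, 0.02, 0.00, 0.00],
    [0.05, 0.85, 0.08, 0.02, 0.00],
    [0.01, 0.06, 0.86, 0.06, 0.01],
    [0.00, 0.02, 0.08, 0.85, 0.05],
    [0.00, 0.00, 0.02, 0.08, 0.90],
])

def perceive(true_bin: int) -> int:
    """Probabilistic abstraction of camera + perception DNN: sample the
    estimated bin from the confusion-matrix row of the true bin."""
    return int(rng.choice(len(STATES), p=C[true_bin]))

def controller(est_bin: int) -> float:
    """Toy proportional steering command toward the center bin."""
    return -0.5 * (est_bin - 2)

def simulate(steps: int = 50) -> bool:
    """One closed-loop run; False if an unsafe edge bin is ever reached."""
    state = 2
    for _ in range(steps):
        cmd = controller(perceive(state))
        # Crude plant model: the command nudges the true bin, plus noise.
        state = int(np.clip(np.rint(state + cmd + rng.normal(0, 0.3)), 0, 4))
        if state in (0, 4):
            return False
    return True

print("estimated P(safe) =", np.mean([simulate() for _ in range(10_000)]))
```

The run-time guards mentioned in the abstract would sit between perceive and controller, rejecting DNN outputs that fail a local, per-input check (e.g. a robustness certificate) and substituting a safe default action.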
Related papers
- Search-based DNN Testing and Retraining with GAN-enhanced Simulations [2.362412515574206]
In safety-critical systems, Deep Neural Networks (DNNs) are becoming a key component for computer vision tasks.
We propose to combine meta-heuristic search, used to explore the input space using simulators, with Generative Adversarial Networks (GANs) to transform the data generated by simulators into realistic input images.
arXiv Detail & Related papers (2024-06-19T09:05:16Z)
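The following is a hedged sketch of the combination described in this entry: a (1+1) evolutionary search mutates simulator parameters, a GAN generator makes the synthetic image realistic, and the fitness is how badly the DNN under test behaves. Every component below is a stand-in stub (simulate, gan_refine, and dnn_misbehaviour are invented for illustration); only the search skeleton is meant to carry over.

```python
import random

def simulate(params):          # simulator: parameters -> synthetic image
    brightness, angle = params
    return {"brightness": brightness, "angle": angle}

def gan_refine(image):         # GAN generator: synthetic -> realistic image
    return image               # identity stand-in

def dnn_misbehaviour(image):   # how badly the DNN under test performs
    return abs(image["angle"]) * (1.0 - image["brightness"])  # toy surrogate

def one_plus_one_search(iterations=200):
    """(1+1) evolutionary search over simulator parameters, keeping a
    mutant whenever it drives the DNN to misbehave more."""
    best = (random.uniform(0.0, 1.0), random.uniform(-30.0, 30.0))
    best_fit = dnn_misbehaviour(gan_refine(simulate(best)))
    for _ in range(iterations):
        cand = (min(1.0, max(0.0, best[0] + random.gauss(0, 0.1))),
                best[1] + random.gauss(0, 3.0))
        fit = dnn_misbehaviour(gan_refine(simulate(cand)))
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

print(one_plus_one_search())
```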
- Scaling #DNN-Verification Tools with Efficient Bound Propagation and Parallel Computing [57.49021927832259]
Deep Neural Networks (DNNs) are powerful tools that have shown extraordinary results in many scenarios.
However, their intricate designs and lack of transparency raise safety concerns when applied in real-world applications.
Formal Verification (FV) of DNNs has emerged as a valuable solution to provide provable guarantees on the safety aspect.
arXiv Detail & Related papers (2023-12-10T13:51:25Z)
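To illustrate what "bound propagation" buys, here is a minimal interval bound propagation (IBP) sketch over a toy ReLU network; it shows the sound output bounds such verifiers compute, not the cited tool's actual algorithm or its parallel scheme.

```python
import numpy as np

def ibp_linear(l, u, W, b):
    """Propagate the box [l, u] through y = W x + b: positive weights take
    the matching endpoint, negative weights the opposite one."""
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ l + Wn @ u + b, Wp @ u + Wn @ l + b

def ibp_relu(l, u):
    return np.maximum(l, 0.0), np.maximum(u, 0.0)

# Toy two-layer ReLU network with illustrative random weights.
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(2, 8)), rng.normal(size=2)

# Bound the outputs over an eps-box around a nominal input x0.
x0, eps = np.zeros(4), 0.1
l, u = ibp_relu(*ibp_linear(x0 - eps, x0 + eps, W1, b1))
l, u = ibp_linear(l, u, W2, b2)
print("output bounds:", list(zip(l, u)))
# If l[0] > u[1], class 0 is provably chosen everywhere in the box.
```

The cited work's contribution is making such propagation scale, e.g. via tighter bounds and parallel exploration of verification sub-problems; the sketch shows only the sequential core.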
- Assumption Generation for the Verification of Learning-Enabled Autonomous Systems [7.580719272198119]
We present an assume-guarantee style compositional approach for the formal verification of system-level safety properties.
We illustrate our approach on a case study taken from the autonomous airplanes domain.
arXiv Detail & Related papers (2023-05-27T23:30:27Z)
- Taming Reachability Analysis of DNN-Controlled Systems via Abstraction-Based Training [14.787056022080625]
This paper presents a novel abstraction-based approach that sidesteps the core difficulty of over-approximating DNNs in reachability analysis.
We extend conventional DNNs by inserting an additional abstraction layer, which abstracts a real number to an interval for training.
We devise the first black-box reachability analysis approach for DNN-controlled systems, where trained DNNs are only queried as black-box oracles for the actions on abstract states.
arXiv Detail & Related papers (2022-11-21T00:11:50Z)
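A rough sketch of the abstraction layer described in this entry, assuming that "abstracting a real number to an interval" means snapping each state component onto a fixed-width grid (each interval represented by its midpoint); the trained controller then needs to be queried, black-box, only on the finitely many abstract states covering a bounded region.

```python
import numpy as np

def abstract(x, width=0.1):
    """Map each real component of x to the midpoint of its enclosing interval."""
    return (np.floor(np.asarray(x) / width) + 0.5) * width

def policy(abs_state):
    """Stand-in for the trained DNN controller, queried as a black box
    on abstract states only (toy bang-bang action)."""
    return np.where(abs_state[..., 0] > 0, -1.0, 1.0)

def abstract_states(lo, hi, width=0.1):
    """Enumerate the finitely many abstract states covering the box [lo, hi];
    black-box reachability analysis explores successors of each of them."""
    grid = [np.arange(l, h, width) + width / 2 for l, h in zip(lo, hi)]
    mesh = np.meshgrid(*grid, indexing="ij")
    return np.stack(mesh, axis=-1).reshape(-1, len(lo))

print(abstract([0.237, -0.412]))            # -> [ 0.25 -0.45]
states = abstract_states(lo=[-0.5, -0.5], hi=[0.5, 0.5])
print(len(states), "abstract states; first action:", policy(states)[0])
```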
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Black-box Safety Analysis and Retraining of DNNs based on Feature Extraction and Clustering [0.9590956574213348]
We propose SAFE, a black-box approach to automatically characterize the root causes of DNN errors.
It relies on a transfer learning model pre-trained on ImageNet to extract the features from error-inducing images.
It then applies a density-based clustering algorithm to detect arbitrarily shaped clusters of images modeling plausible causes of error.
arXiv Detail & Related papers (2022-01-13T17:02:57Z)
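A sketch of a SAFE-style pipeline under stated assumptions: ResNet-18 stands in for the ImageNet-pretrained feature extractor and scikit-learn's DBSCAN for the density-based clustering; eps and min_samples are illustrative values, not the paper's settings.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.cluster import DBSCAN

# ImageNet-pretrained backbone; drop the classifier head to keep
# the 512-d penultimate features.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(error_images):
    """Map a list of PIL error-inducing images to feature vectors."""
    batch = torch.stack([preprocess(im) for im in error_images])
    return backbone(batch).numpy()

def root_cause_clusters(error_images, eps=5.0, min_samples=3):
    """Density-based clustering of error-inducing images: each cluster is
    a candidate root cause; label -1 marks unclustered outliers."""
    feats = embed(error_images)
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(feats)
```

Usage: given a list of PIL images that induced DNN errors, labels = root_cause_clusters(error_images) assigns each image to a candidate root-cause cluster, with label -1 marking outliers.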
- A novel Deep Neural Network architecture for non-linear system identification [78.69776924618505]
We present a novel Deep Neural Network (DNN) architecture for non-linear system identification.
Inspired by fading memory systems, we introduce an inductive bias (on the architecture) and regularization (on the loss function).
This architecture allows for automatic complexity selection based solely on available data.
arXiv Detail & Related papers (2021-06-06T10:06:07Z)
- On the benefits of robust models in modulation recognition [53.391095789289736]
Deep Neural Networks (DNNs) using convolutional layers are state-of-the-art in many tasks in communications.
In other domains, like image classification, DNNs have been shown to be vulnerable to adversarial perturbations.
We propose a novel framework to test the robustness of current state-of-the-art models.
arXiv Detail & Related papers (2021-03-27T19:58:06Z)
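As a concrete example of the kind of robustness probe such testing frameworks build on, here is a standard FGSM sketch against a toy I/Q-signal classifier; FGSM is a common baseline attack, not the specific framework this paper proposes, and the model below is invented for illustration.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps):
    """Fast Gradient Sign Method: perturb x by eps in the direction
    that most increases the classification loss."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    return (x + eps * x.grad.sign()).detach()

def robust_accuracy(model, x, y, eps):
    """Fraction of inputs still correctly classified after the attack."""
    x_adv = fgsm_attack(model, x, y, eps)
    with torch.no_grad():
        pred = model(x_adv).argmax(dim=1)
    return (pred == y).float().mean().item()

# Toy 'modulation classifier' over I/Q sample windows.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(2 * 128, 4))
x = torch.randn(32, 2, 128)            # batch of I/Q signal windows
y = torch.randint(0, 4, (32,))
print("robust accuracy @ eps=0.05:", robust_accuracy(model, x, y, 0.05))
```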
- Risk-Averse MPC via Visual-Inertial Input and Recurrent Networks for Online Collision Avoidance [95.86944752753564]
We propose an online path planning architecture that extends the model predictive control (MPC) formulation to consider future location uncertainties.
Our algorithm combines an object detection pipeline with a recurrent neural network (RNN) which infers the covariance of state estimates.
The robustness of our method is validated on complex quadruped robot dynamics, and the approach can be applied to most robotic platforms.
arXiv Detail & Related papers (2020-07-28T07:34:30Z)
- Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising direction, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z)
- 1D CNN Based Network Intrusion Detection with Normalization on Imbalanced Data [0.19336815376402716]
Intrusion detection systems (IDSs) play an essential role in computer networks, protecting computing resources and data from outside attacks.
Recent IDSs face challenges in improving their flexibility and efficiency against unexpected and unpredictable attacks.
We propose a deep learning approach to building an efficient and flexible IDS using a one-dimensional Convolutional Neural Network (1D-CNN).
arXiv Detail & Related papers (2020-03-01T12:23:46Z)
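A minimal sketch of a 1D-CNN intrusion detector in the spirit of this entry; the feature count (78, typical of some flow-based datasets), the layer sizes, and the normalization placement are illustrative assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

class IDS1DCNN(nn.Module):
    """Toy 1D-CNN over per-flow feature vectors, treated as 1-channel signals."""
    def __init__(self, n_features=78, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.BatchNorm1d(1),                  # normalize raw flow features
            nn.Conv1d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                       # x: (batch, 1, n_features)
        return self.net(x)

model = IDS1DCNN()
flows = torch.randn(16, 1, 78)                  # 16 flow records, 78 features
print(model(flows).shape)                       # -> torch.Size([16, 2])
```

The imbalanced-data aspect would typically be handled during training, e.g. with class-weighted cross-entropy (torch.nn.CrossEntropyLoss(weight=...)) or resampling.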
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.