Verification of Neural Network Control Systems using Symbolic Zonotopes
and Polynotopes
- URL: http://arxiv.org/abs/2306.14619v1
- Date: Mon, 26 Jun 2023 11:52:14 GMT
- Title: Verification of Neural Network Control Systems using Symbolic Zonotopes
and Polynotopes
- Authors: Carlos Trapiello, Christophe Combastel, Ali Zolghadri
- Abstract summary: Verification and safety assessment of neural network controlled systems (NNCSs) is an emerging challenge.
To provide guarantees, verification tools must efficiently capture the interplay between the neural network and the physical system within the control loop.
A compositional approach focused on preserving long term symbolic dependency is proposed for the analysis of NNCSs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Verification and safety assessment of neural network controlled systems
(NNCSs) is an emerging challenge. To provide guarantees, verification tools
must efficiently capture the interplay between the neural network and the
physical system within the control loop. In this paper, a compositional
approach focused on inclusion preserving long term symbolic dependency modeling
is proposed for the analysis of NNCSs. First of all, the matrix structure of
symbolic zonotopes is exploited to efficiently abstract the input/output
mapping of the loop elements through (inclusion preserving) affine symbolic
expressions, thus maintaining linear dependencies between interacting blocks.
Then, two further extensions are studied. Firstly, symbolic polynotopes are
used to abstract the loop elements behaviour by means of polynomial symbolic
expressions and dependencies. Secondly, an original input partitioning
algorithm takes advantage of symbol preservation to assess the sensitivity of
the computed approximation to some input directions. The approach is evaluated
via different numerical examples and benchmarks. A good trade-off between low
conservatism and computational efficiency is obtained.
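A minimal sketch of the symbolic-zonotope idea from the abstract: generators are keyed by global symbol identifiers, so sets built from shared symbols keep their linear dependencies (e.g. Z - Z collapses to {0} instead of widening). The class name and API below are illustrative assumptions, not the paper's code.

```python
# Symbolic zonotope sketch: Z = { c + sum_i g_i * s_i : s_i in [-1, 1] },
# with each generator g_i keyed by a symbol id. Keeping symbol ids lets
# affine operations preserve dependencies between interacting blocks.

class SymZonotope:
    def __init__(self, center, gens):
        self.c = list(center)   # center vector
        self.g = dict(gens)     # symbol id -> generator (column) vector

    def affine(self, A, b):
        """Image under x -> A x + b (A given as a list of rows)."""
        mul = lambda v: [sum(a * x for a, x in zip(row, v)) for row in A]
        c = [ci + bi for ci, bi in zip(mul(self.c), b)]
        return SymZonotope(c, {s: mul(col) for s, col in self.g.items()})

    def add(self, other):
        """Sum that MERGES generators sharing a symbol id, preserving
        linear dependencies between the two operands."""
        c = [a + b for a, b in zip(self.c, other.c)]
        gens = {s: list(col) for s, col in self.g.items()}
        for s, col in other.g.items():
            if s in gens:
                gens[s] = [a + b for a, b in zip(gens[s], col)]
            else:
                gens[s] = list(col)
        return SymZonotope(c, gens)

    def interval_hull(self):
        """Componentwise enclosure [c - r, c + r] with r = sum_i |g_i|."""
        r = [sum(abs(col[k]) for col in self.g.values())
             for k in range(len(self.c))]
        return [(ci - ri, ci + ri) for ci, ri in zip(self.c, r)]

# Z = [1,3] x [-1,1], built on symbols 0 and 1
Z = SymZonotope([2.0, 0.0], {0: [1.0, 0.0], 1: [0.0, 1.0]})
negZ = Z.affine([[-1.0, 0.0], [0.0, -1.0]], [0.0, 0.0])
# Z + (-Z) collapses to {0} because shared symbols cancel:
print(Z.add(negZ).interval_hull())   # -> [(0.0, 0.0), (0.0, 0.0)]
```

A plain (symbol-free) zonotope sum would instead concatenate the generators and return [-2,2] x [-2,2] here, which illustrates the conservatism that dependency preservation avoids.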
Related papers
- Parseval Convolution Operators and Neural Networks [16.78532039510369]
We first identify the Parseval convolution operators as the class of energy-preserving filterbanks.
We then present a constructive approach for the design/specification of such filterbanks via the chaining of elementary Parseval modules.
We demonstrate the usage of those tools with the design of a CNN-based algorithm for the iterative reconstruction of biomedical images.
arXiv Detail & Related papers (2024-08-19T13:31:16Z)
- Semantic Loss Functions for Neuro-Symbolic Structured Prediction [74.18322585177832]
We discuss the semantic loss, which injects knowledge about such structure, defined symbolically, into training.
It is agnostic to the arrangement of the symbols, and depends only on the semantics expressed thereby.
It can be combined with both discriminative and generative neural models.
arXiv Detail & Related papers (2024-05-12T22:18:25Z)
- The Role of Foundation Models in Neuro-Symbolic Learning and Reasoning [54.56905063752427]
Neuro-Symbolic AI (NeSy) holds promise to ensure the safe deployment of AI systems.
Existing pipelines that train the neural and symbolic components sequentially require extensive labelling.
New architecture, NeSyGPT, fine-tunes a vision-language foundation model to extract symbolic features from raw data.
arXiv Detail & Related papers (2024-02-02T20:33:14Z)
- Point-Based Value Iteration for POMDPs with Neural Perception Mechanisms [31.51588071503617]
We introduce neuro-symbolic partially observable Markov decision processes (NS-POMDPs).
We propose a novel piecewise linear and convex representation (P-PWLC) in terms of polyhedra covering the state space and value vectors.
We show the practical applicability of our approach on two case studies that employ (trained) ReLU neural networks as perception functions.
arXiv Detail & Related papers (2023-06-30T13:26:08Z)
- Provable Preimage Under-Approximation for Neural Networks (Full Version) [27.519993407376862]
We propose an efficient anytime algorithm for generating symbolic under-approximations of the preimage of any polyhedron output set for neural networks.
Empirically, we validate the efficacy of our method across a range of domains, including a high-dimensional MNIST classification task.
We present a sound and complete algorithm for the former, which exploits our disjoint union of polytopes representation to provide formal guarantees.
arXiv Detail & Related papers (2023-05-05T16:55:27Z)
- Interval Reachability of Nonlinear Dynamical Systems with Neural Network Controllers [5.543220407902113]
This paper proposes a computationally efficient framework, based on interval analysis, for rigorous verification of nonlinear continuous-time dynamical systems with neural network controllers.
Inspired by mixed monotone theory, we embed the closed-loop dynamics into a larger system using an inclusion function of the neural network and a decomposition function of the open-loop system.
We show that one can efficiently compute hyper-rectangular over-approximations of the reachable sets using a single trajectory of the embedding system.
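A toy sketch in the spirit of this entry: an interval inclusion function for a ReLU network, composed with simple discrete-time dynamics to get a hyper-rectangular over-approximation of the closed-loop step. The network weights and dynamics below are made-up illustrative values, not the paper's benchmarks.

```python
# Interval arithmetic through an affine layer: for x in [lo, hi], the
# tight bounds of W x + b pick lo or hi per entry by the sign of w.
def affine_bounds(W, b, lo, hi):
    out_lo, out_hi = [], []
    for row, bi in zip(W, b):
        l = bi + sum(w * (lo[j] if w >= 0 else hi[j]) for j, w in enumerate(row))
        h = bi + sum(w * (hi[j] if w >= 0 else lo[j]) for j, w in enumerate(row))
        out_lo.append(l); out_hi.append(h)
    return out_lo, out_hi

def relu_net_bounds(layers, lo, hi):
    """Inclusion function of a ReLU net given as [(W, b), ...];
    ReLU is applied after every layer except the last."""
    for i, (W, b) in enumerate(layers):
        lo, hi = affine_bounds(W, b, lo, hi)
        if i < len(layers) - 1:
            lo = [max(0.0, v) for v in lo]
            hi = [max(0.0, v) for v in hi]
    return lo, hi

# Toy controller u = net(x) and scalar dynamics x+ = x + dt * u.
layers = [([[1.0], [-1.0]], [0.0, 0.0]),   # hidden layer, 2 units
          ([[0.5, -0.5]], [0.0])]          # output layer

def step_bounds(x_lo, x_hi, dt=0.1):
    u_lo, u_hi = relu_net_bounds(layers, [x_lo], [x_hi])
    return x_lo + dt * u_lo[0], x_hi + dt * u_hi[0]

print(step_bounds(-1.0, 1.0))
```

The paper's embedding-system construction is more refined (mixed monotone decomposition functions, continuous time); this sketch only shows the basic interval-propagation mechanism that such inclusion functions build on.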
arXiv Detail & Related papers (2023-01-19T06:46:36Z)
- Symbolic Distillation for Learned TCP Congestion Control [70.27367981153299]
TCP congestion control has achieved tremendous success with deep reinforcement learning (RL) approaches.
Black-box policies lack interpretability and reliability, and often, they need to operate outside the traditional TCP datapath.
This paper proposes a novel two-stage solution to achieve the best of both worlds: first train a deep RL agent, then distill its NN policy into white-box, lightweight rules.
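A simplified stand-in for the two-stage idea: query a trained policy (mocked here as a black-box function) on sampled states, then fit a single interpretable threshold rule that imitates it. Everything below, including the loss-rate feature and threshold search, is an illustrative assumption, not the paper's method.

```python
# Stage 1 stand-in: a "trained" black-box policy over one feature.
def blackbox_policy(loss_rate):
    """Mock deep-RL policy: 1 = increase cwnd, 0 = decrease."""
    return 1 if loss_rate < 0.03 else 0

# Stage 2: distill the policy into a white-box threshold rule by
# exhaustively picking the candidate threshold with best agreement.
def distill_threshold(policy, samples):
    labels = [policy(s) for s in samples]
    best_t, best_acc = None, -1.0
    for t in samples:
        acc = sum((1 if s < t else 0) == y
                  for s, y in zip(samples, labels)) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

samples = [i / 1000.0 for i in range(100)]   # loss rates 0.000 .. 0.099
t, acc = distill_threshold(blackbox_policy, samples)
print(f"rule: increase cwnd iff loss_rate < {t}, agreement = {acc:.2f}")
```

The distilled rule can then run in the TCP datapath without the NN, which is the interpretability/deployment benefit the abstract highlights.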
arXiv Detail & Related papers (2022-10-24T00:58:16Z)
- Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as structure prior and reveal the underlying signal interdependencies.
Deep unrolling and Deep equilibrium based algorithms are developed, forming highly interpretable and concise deep-learning-based architectures.
arXiv Detail & Related papers (2022-03-29T21:00:39Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Closed Loop Neural-Symbolic Learning via Integrating Neural Perception, Grammar Parsing, and Symbolic Reasoning [134.77207192945053]
Prior methods learn the neural-symbolic models using reinforcement learning approaches.
We introduce the grammar model as a symbolic prior to bridge neural perception and symbolic reasoning.
We propose a novel back-search algorithm which mimics the top-down human-like learning procedure to propagate the error.
arXiv Detail & Related papers (2020-06-11T17:42:49Z)
- ENIGMA Anonymous: Symbol-Independent Inference Guiding Machine (system description) [0.4893345190925177]
We describe an implementation of gradient boosting and neural guidance of saturation-style automated theorem provers.
For the gradient-boosting guidance, we manually create abstracted features by considering arity-based encodings of formulas.
For the neural guidance, we use symbol-independent graph neural networks (GNNs) and their embedding of the terms and clauses.
arXiv Detail & Related papers (2020-02-13T09:44:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.