A Toolbox for Fast Interval Arithmetic in numpy with an Application to Formal Verification of Neural Network Controlled Systems
- URL: http://arxiv.org/abs/2306.15340v1
- Date: Tue, 27 Jun 2023 09:50:47 GMT
- Title: A Toolbox for Fast Interval Arithmetic in numpy with an Application to Formal Verification of Neural Network Controlled Systems
- Authors: Akash Harapanahalli, Saber Jafarpour, Samuel Coogan
- Abstract summary: We present a toolbox for interval analysis in numpy, with an application to formal verification of neural network controlled systems.
The toolbox offers efficient computation of natural inclusion functions using compiled C code, as well as a familiar interface in numpy.
We then use this toolbox in formal verification of dynamical systems with neural network controllers, through the composition of their inclusion functions.
- Score: 5.543220407902113
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a toolbox for interval analysis in numpy, with an
application to formal verification of neural network controlled systems. Using
the notion of natural inclusion functions, we systematically construct interval
bounds for a general class of mappings. The toolbox offers efficient
computation of natural inclusion functions using compiled C code, as well as a
familiar interface in numpy with its canonical features, such as n-dimensional
arrays, matrix/vector operations, and vectorization. We then use this toolbox
in formal verification of dynamical systems with neural network controllers,
through the composition of their inclusion functions.
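To make the core idea concrete, below is a minimal pure-numpy sketch of interval arithmetic and a natural inclusion function. The Interval class and its methods are illustrative assumptions for this digest, not the actual API of the paper's toolbox (npinterval), which uses compiled C code.

```python
# Minimal sketch of interval arithmetic in pure numpy; the class and
# method set are illustrative, not npinterval's actual API.
import numpy as np

class Interval:
    """Elementwise interval [lo, hi] over numpy arrays."""
    def __init__(self, lo, hi):
        self.lo, self.hi = np.asarray(lo, float), np.asarray(hi, float)

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a, b] * [c, d]: min/max over the four endpoint products
        p = np.stack([self.lo * other.lo, self.lo * other.hi,
                      self.hi * other.lo, self.hi * other.hi])
        return Interval(p.min(axis=0), p.max(axis=0))

    def __repr__(self):
        return f"Interval({self.lo}, {self.hi})"

# A natural inclusion function falls out of operator overloading: evaluate
# the same numpy-style expression f(x) = x*x + x on an Interval input.
x = Interval([-1.0], [2.0])
print(x * x + x)   # Interval([-3.], [6.]): guaranteed enclosure of the range
```

Note that the enclosure is conservative: the true range of x*x + x on [-1, 2] is [-0.25, 6]. Natural inclusion functions trade tightness for generality and speed.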
Related papers
- Simulation-based inference with the Python Package sbijax [0.7499722271664147]
sbijax is a Python package that implements a wide variety of state-of-the-art methods in neural simulation-based inference.
The package provides functionality for approximate Bayesian computation, for computing model diagnostics, and for automatically estimating summary statistics.
arXiv Detail & Related papers (2024-09-28T18:47:13Z) - Forward Invariance in Neural Network Controlled Systems [5.359060261460183]
- Forward Invariance in Neural Network Controlled Systems [5.359060261460183]
We present a framework based on interval analysis and monotone systems theory to certify and search for forward invariant sets in nonlinear systems with neural network controllers.
The framework is automated in Python using our interval analysis toolbox npinterval, in conjunction with the symbolic arithmetic toolbox sympy, and demonstrated on an 8-dimensional leader-follower system.
arXiv Detail & Related papers (2023-09-16T16:49:19Z) - A Data-Driven Hybrid Automaton Framework to Modeling Complex Dynamical
- A Data-Driven Hybrid Automaton Framework to Modeling Complex Dynamical Systems [2.610470075814367]
A data-driven hybrid automaton model is proposed to capture unknown complex dynamical system behaviors.
Small-scale neural networks are trained as local dynamical descriptions for their corresponding topologies.
A numerical example of the limit cycle is presented to illustrate that the developed models can significantly reduce the computational cost in reachable set computation.
arXiv Detail & Related papers (2023-04-26T20:18:12Z) - Classified as unknown: A novel Bayesian neural network [0.0]
- Classified as unknown: A novel Bayesian neural network [0.0]
We develop a new efficient Bayesian learning algorithm for fully connected neural networks.
We generalize the algorithm for a single perceptron for binary classification in [H] to multi-layer perceptrons for multi-class classification.
arXiv Detail & Related papers (2023-01-31T04:27:09Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Adaptive Group Lasso Neural Network Models for Functions of Few
- Adaptive Group Lasso Neural Network Models for Functions of Few Variables and Time-Dependent Data [4.18804572788063]
We approximate the target function by a deep neural network and enforce an adaptive group Lasso constraint on the weights of a suitable hidden layer.
Our empirical studies show that the proposed method outperforms recent state-of-the-art methods including the sparse dictionary matrix method.
arXiv Detail & Related papers (2021-08-24T16:16:46Z) - Intersection Regularization for Extracting Semantic Attributes [72.53481390411173]
- Intersection Regularization for Extracting Semantic Attributes [72.53481390411173]
We consider the problem of supervised classification, such that the features that the network extracts match an unseen set of semantic attributes.
For example, when learning to classify images of birds into species, we would like to observe the emergence of features that zoologists use to classify birds.
We propose training a neural network with discrete top-level activations, which is followed by a multi-layer perceptron (MLP) and a parallel decision tree.
arXiv Detail & Related papers (2021-03-22T14:32:44Z) - UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
- UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions at each event.
arXiv Detail & Related papers (2020-07-28T09:31:56Z) - Provably Efficient Neural Estimation of Structural Equation Model: An
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z) - A Trainable Optimal Transport Embedding for Feature Aggregation and its
- A Trainable Optimal Transport Embedding for Feature Aggregation and its Relationship to Attention [96.77554122595578]
We introduce a parametrized representation of fixed size, which embeds and then aggregates elements from a given input set according to the optimal transport plan between the set and a trainable reference.
Our approach scales to large datasets and allows end-to-end training of the reference, while also providing a simple unsupervised learning mechanism with small computational cost.
arXiv Detail & Related papers (2020-06-22T08:35:58Z) - Formal Synthesis of Lyapunov Neural Networks [61.79595926825511]
- Formal Synthesis of Lyapunov Neural Networks [61.79595926825511]
We propose an automatic and formally sound method for synthesising Lyapunov functions.
We employ a counterexample-guided approach where a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks.
Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing stronger or equal guarantees.
arXiv Detail & Related papers (2020-03-19T17:21:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.