Forward Invariance in Neural Network Controlled Systems
- URL: http://arxiv.org/abs/2309.09043v2
- Date: Sat, 9 Dec 2023 23:42:28 GMT
- Title: Forward Invariance in Neural Network Controlled Systems
- Authors: Akash Harapanahalli, Saber Jafarpour, Samuel Coogan
- Abstract summary: We present a framework based on interval analysis and monotone systems theory to certify and search for forward invariant sets in nonlinear systems with neural network controllers.
The framework is automated in Python using our interval analysis toolbox $\texttt{npinterval}$, in conjunction with the symbolic arithmetic toolbox $\texttt{sympy}$, demonstrated on an $8$-dimensional leader-follower system.
- Score: 5.359060261460183
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a framework based on interval analysis and monotone systems theory
to certify and search for forward invariant sets in nonlinear systems with
neural network controllers. The framework (i) constructs localized first-order
inclusion functions for the closed-loop system using Jacobian bounds and
existing neural network verification tools; (ii) builds a dynamical embedding
system whose evaluation along a single trajectory directly corresponds to
a nested family of hyper-rectangles provably converging to an attractive set of
the original system; (iii) utilizes linear transformations to build families of
nested paralleletopes with the same properties. The framework is automated in
Python using our interval analysis toolbox $\texttt{npinterval}$, in
conjunction with the symbolic arithmetic toolbox $\texttt{sympy}$, demonstrated
on an $8$-dimensional leader-follower system.
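The embedding-system idea in point (ii) can be illustrated with a deliberately simplified sketch. The toy map, the monotonicity assumption, and all names below are illustrative stand-ins, not the paper's construction (which handles continuous-time dynamics, Jacobian-based inclusion functions, and real neural network controllers):

```python
import math

# Hypothetical 1-D closed loop: x+ = 0.5*x + 0.1*tanh(x), where the
# tanh term plays the role of a tiny "neural" controller. The map is
# monotone increasing, so the embedding system simply propagates the
# two box endpoints; nested boxes certify forward invariance.

def f(x):
    return 0.5 * x + 0.1 * math.tanh(x)

def embedding_step(lo, hi):
    # For a monotone increasing map, the interval image is [f(lo), f(hi)].
    return f(lo), f(hi)

def nested_boxes(lo, hi, steps):
    boxes = [(lo, hi)]
    for _ in range(steps):
        lo, hi = embedding_step(lo, hi)
        boxes.append((lo, hi))
    return boxes

boxes = nested_boxes(-1.0, 1.0, 5)
# Each box is contained in the previous one, so [-1, 1] is forward
# invariant and the boxes shrink toward the attractive set {0}.
invariant = all(b_lo >= a_lo and b_hi <= a_hi
                for (a_lo, a_hi), (b_lo, b_hi) in zip(boxes, boxes[1:]))
```

A single run of `nested_boxes` is one trajectory of the (here 2-dimensional) embedding system, and the containment check is the invariance certificate.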
Related papers
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variates functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
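The control-variate construction can be sketched with a toy stand-in: replace the learned network with a fixed quadratic surrogate `g` whose anti-derivative `G` is known in closed form, and Monte Carlo only the residual. All function names are illustrative; the paper learns `g` with a neural network and obtains the integral via automatic integration:

```python
import math
import random

def f(x):
    return math.exp(x)  # integrand; exact integral on [0, 1] is e - 1

def g(x):
    # Surrogate "network": the quadratic Taylor expansion of exp.
    return 1.0 + x + 0.5 * x * x

def G(x):
    # Closed-form anti-derivative of g, so its integral is exact.
    return x + 0.5 * x * x + x ** 3 / 6.0

def cv_estimate(n, seed=0):
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    # Monte Carlo only the (small) residual f - g, then add the
    # exactly-known integral of the control variate g.
    residual = sum(f(x) - g(x) for x in xs) / n
    return residual + (G(1.0) - G(0.0))

est = cv_estimate(10_000)
```

Because the residual `f - g` has far smaller variance than `f` itself, the estimator converges much faster than plain Monte Carlo at the same sample count.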
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
arXiv Detail & Related papers (2023-10-20T12:45:12Z) - Efficient Interaction-Aware Interval Analysis of Neural Network Feedback Loops [4.768272342753616]
We propose a computationally efficient framework for interval reachability of systems with neural network controllers.
We use inclusion functions for the open-loop system and the neural network controller to embed the closed-loop system into a larger-dimensional embedding system.
arXiv Detail & Related papers (2023-07-27T15:30:22Z)
- A Toolbox for Fast Interval Arithmetic in numpy with an Application to Formal Verification of Neural Network Controlled Systems [5.543220407902113]
We present a toolbox for interval analysis in numpy, with an application to formal verification of neural network controlled systems.
The toolbox offers efficient computation of natural inclusion functions using compiled C code, as well as a familiar interface in numpy.
We then use this toolbox in formal verification of dynamical systems with neural network controllers, through the composition of their inclusion functions.
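The natural inclusion functions the toolbox computes amount to evaluating an expression with interval operands. The pure-Python `Interval` class below is a minimal illustration only; the actual toolbox implements the arithmetic as a compiled numpy dtype:

```python
class Interval:
    """Minimal closed-interval type supporting +, *, and scalar - interval."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        other = _as_interval(other)
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __rsub__(self, other):  # scalar - interval
        return Interval(other - self.hi, other - self.lo)

    def __mul__(self, other):
        # Interval product: min/max over all endpoint products.
        other = _as_interval(other)
        ps = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(ps), max(ps))

def _as_interval(v):
    return v if isinstance(v, Interval) else Interval(v, v)

def logistic(x):
    return x * (1 - x)

# Natural inclusion function: call the expression on interval arguments.
box = logistic(Interval(0.0, 1.0))
# box = [0, 1] encloses the true range [0, 0.25]; the slack comes from
# the classic interval dependence problem (x appears twice).
```

The example also shows why the paper pairs such a toolbox with sympy: symbolic rewriting and first-order (Jacobian-based) inclusion functions can tighten bounds that naive natural inclusion leaves loose.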
arXiv Detail & Related papers (2023-06-27T09:50:47Z)
- Interval Reachability of Nonlinear Dynamical Systems with Neural Network Controllers [5.543220407902113]
This paper proposes a computationally efficient framework, based on interval analysis, for rigorous verification of nonlinear continuous-time dynamical systems with neural network controllers.
Inspired by mixed monotone theory, we embed the closed-loop dynamics into a larger system using an inclusion function of the neural network and a decomposition function of the open-loop system.
We show that one can efficiently compute hyper-rectangular over-approximations of the reachable sets using a single trajectory of the embedding system.
arXiv Detail & Related papers (2023-01-19T06:46:36Z)
- Automated Reachability Analysis of Neural Network-Controlled Systems via Adaptive Polytopes [2.66512000865131]
We develop a new approach for over-approximating the reachable sets of neural network dynamical systems using adaptive template polytopes.
We illustrate the utility of the proposed approach in the reachability analysis of linear systems driven by neural network controllers.
arXiv Detail & Related papers (2022-12-14T23:49:53Z)
- Robust Training and Verification of Implicit Neural Networks: A Non-Euclidean Contractive Approach [64.23331120621118]
This paper proposes a theoretical and computational framework for training and robustness verification of implicit neural networks.
We introduce a related embedded network and show that the embedded network can be used to provide an $\ell_\infty$-norm box over-approximation of the reachable sets of the original network.
We apply our algorithms to train implicit neural networks on the MNIST dataset and compare the robustness of our models with the models trained via existing approaches in the literature.
arXiv Detail & Related papers (2022-08-08T03:13:24Z)
- Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns the basis functions for such a representation via supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z)
- Certifying Incremental Quadratic Constraints for Neural Networks via Convex Optimization [2.388501293246858]
We propose a convex program to certify incremental quadratic constraints on the map of neural networks over a region of interest.
The certificates can capture several useful properties such as (local) Lipschitz continuity, one-sided Lipschitz continuity, invertibility, and contraction.
arXiv Detail & Related papers (2020-12-10T21:15:00Z)
- Neural Subdivision [58.97214948753937]
This paper introduces Neural Subdivision, a novel framework for data-driven coarse-to-fine geometry modeling.
We optimize for the same set of network weights across all local mesh patches, thus providing an architecture that is not constrained to a specific input mesh, fixed genus, or category.
We demonstrate that even when trained on a single high-resolution mesh our method generates reasonable subdivisions for novel shapes.
arXiv Detail & Related papers (2020-05-04T20:03:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.