Automated and Sound Synthesis of Lyapunov Functions with SMT Solvers
- URL: http://arxiv.org/abs/2007.10865v1
- Date: Tue, 21 Jul 2020 14:45:23 GMT
- Title: Automated and Sound Synthesis of Lyapunov Functions with SMT Solvers
- Authors: Daniele Ahmed, Andrea Peruffo, Alessandro Abate
- Abstract summary: We synthesise Lyapunov functions for linear, non-linear (polynomial) and parametric models.
We exploit an inductive framework to synthesise Lyapunov functions, starting from parametric templates.
- Score: 70.70479436076238
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we employ SMT solvers to soundly synthesise Lyapunov functions
that assert the stability of a given dynamical model. The search for a Lyapunov
function is framed as the satisfiability of a second-order logical formula,
asking whether there exists a function satisfying a desired specification
(stability) for all possible initial conditions of the model. We synthesise
Lyapunov functions for linear, non-linear (polynomial), and for parametric
models. For non-linear models, the algorithm also determines a region of
validity for the Lyapunov function. We exploit an inductive framework to
synthesise Lyapunov functions, starting from parametric templates. The
inductive framework comprises two elements: a learner proposes a Lyapunov
function, and a verifier checks its validity; any invalidity is witnessed by a
counterexample (a point in the state space), which is fed back to the learner.
Whilst the verifier uses the SMT solver Z3, thus ensuring the overall soundness
of the procedure, we examine two alternatives for the learner: a numerical
approach based on the optimisation tool Gurobi, and a sound approach based
again on Z3. The overall technique is evaluated over a broad set of benchmarks,
which shows that this methodology not only scales to 10-dimensional models
within reasonable computational time, but also offers a novel soundness proof
for the generated Lyapunov functions and their domains of validity.
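To make the inductive loop concrete, the following is a minimal sketch of a CEGIS-style learner/verifier exchange in Python with the Z3 bindings (z3-solver). The two-dimensional linear model, the diagonal quadratic template V(x) = p1*x1^2 + p2*x2^2, the parameter bounds, and the iteration cap are illustrative assumptions rather than values from the paper, and the learner shown is the fully sound Z3 variant rather than the Gurobi-based numerical one.

```python
# A minimal CEGIS-style sketch of the learner/verifier loop described above,
# written against the Z3 Python bindings (pip install z3-solver). The 2-D
# linear model, the diagonal quadratic template V(x) = p1*x1^2 + p2*x2^2,
# the parameter bounds and the iteration cap are illustrative assumptions,
# not values taken from the paper.
from z3 import Reals, Solver, Or, sat, unsat, RealVal, AlgebraicNumRef

def f(x1, x2):
    # Assumed stable linear dynamics dx/dt = A x, A = [[-1, 1], [0, -2]].
    return (-x1 + x2, -2 * x2)

def V(p1, p2, x1, x2):
    # Parametric Lyapunov template.
    return p1 * x1 ** 2 + p2 * x2 ** 2

def Vdot(p1, p2, x1, x2):
    # Lie derivative dV/dt = grad(V) . f(x).
    f1, f2 = f(x1, x2)
    return 2 * p1 * x1 * f1 + 2 * p2 * x2 * f2

def numeral(v):
    # Model values may be algebraic reals; approximate them by rationals.
    # Soundness still rests on the final exact-arithmetic 'unsat' below.
    return v.approx(10) if isinstance(v, AlgebraicNumRef) else v

x1, x2 = Reals("x1 x2")                 # verifier's state variables
q1, q2 = Reals("q1 q2")                 # learner's template parameters
p1, p2 = RealVal(1), RealVal("1/10")    # deliberately poor initial guess
counterexamples = []

for _ in range(20):
    # Verifier (Z3): search for a state falsifying V > 0 or dV/dt < 0.
    verifier = Solver()
    verifier.add(x1 ** 2 + x2 ** 2 > 0)  # exclude the equilibrium
    verifier.add(Or(V(p1, p2, x1, x2) <= 0, Vdot(p1, p2, x1, x2) >= 0))
    res = verifier.check()
    if res == unsat:
        print(f"Valid Lyapunov function: V = {p1}*x1^2 + {p2}*x2^2")
        break
    if res != sat:
        print("Verifier returned unknown; giving up.")
        break
    m = verifier.model()
    counterexamples.append((numeral(m.eval(x1, model_completion=True)),
                            numeral(m.eval(x2, model_completion=True))))

    # Learner (here also Z3, i.e. the fully sound variant): propose new
    # parameters consistent with every counterexample collected so far.
    learner = Solver()
    learner.add(q1 > 0, q2 > 0, q1 <= 100, q2 <= 100)
    for c1, c2 in counterexamples:
        learner.add(V(q1, q2, c1, c2) > 0, Vdot(q1, q2, c1, c2) < 0)
    if learner.check() != sat:
        print("Template exhausted: no parameters fit the counterexamples.")
        break
    m = learner.model()
    p1 = numeral(m.eval(q1, model_completion=True))
    p2 = numeral(m.eval(q2, model_completion=True))
```

As in the paper, soundness rests entirely on the verifier: the loop reports success only when Z3 proves, in exact arithmetic, that no state falsifies the Lyapunov conditions for the proposed parameters.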
Related papers
- Combining Neural Networks and Symbolic Regression for Analytical Lyapunov Function Discovery [3.803654983282309]
We propose CoNSAL (Combining Neural networks and Symbolic regression for Analytical Lyapunov function) to construct analytical Lyapunov functions for nonlinear dynamic systems.
This framework contains a neural Lyapunov function and a symbolic regression component, where symbolic regression is applied to distill the neural network to precise analytical forms.
arXiv Detail & Related papers (2024-06-21T22:31:06Z)
- RUMBoost: Gradient Boosted Random Utility Models [0.0]
The RUMBoost model combines the interpretability and behavioural robustness of Random Utility Models (RUMs) with the generalisation and predictive ability of deep learning methods.
We demonstrate the potential of the RUMBoost model compared to various ML and Random Utility benchmark models for revealed preference mode choice data from London.
arXiv Detail & Related papers (2024-01-22T13:54:26Z)
- Efficient Model-Free Exploration in Low-Rank MDPs [76.87340323826945]
Low-Rank Markov Decision Processes offer a simple, yet expressive framework for RL with function approximation.
Existing algorithms are either (1) computationally intractable, or (2) reliant upon restrictive statistical assumptions.
We propose the first provably sample-efficient algorithm for exploration in Low-Rank MDPs.
arXiv Detail & Related papers (2023-07-08T15:41:48Z)
- FAENet: Frame Averaging Equivariant GNN for Materials Modeling [123.19473575281357]
We introduce a flexible framework relying on stochastic frame averaging (SFA) to make any model E(3)-equivariant or invariant through data transformations.
We prove the validity of our method theoretically and empirically demonstrate its superior accuracy and computational scalability in materials modeling.
arXiv Detail & Related papers (2023-04-28T21:48:31Z)
- Reinforcement Learning from Partial Observation: Linear Function Approximation with Provable Sample Efficiency [111.83670279016599]
We study reinforcement learning for partially observable Markov decision processes (POMDPs) with infinite observation and state spaces.
We make the first attempt at partial observability and function approximation for a class of POMDPs with a linear structure.
arXiv Detail & Related papers (2022-04-20T21:15:38Z)
- High-dimensional Functional Graphical Model Structure Learning via Neighborhood Selection Approach [15.334392442475115]
We propose a neighborhood selection approach to estimate the structure of functional graphical models.
We thus circumvent the need for a well-defined precision operator that may not exist when the functions are infinite dimensional.
arXiv Detail & Related papers (2021-05-06T07:38:50Z)
- Sinkhorn Natural Gradient for Generative Models [125.89871274202439]
We propose a novel Sinkhorn Natural Gradient (SiNG) algorithm which acts as a steepest descent method on the probability space endowed with the Sinkhorn divergence.
We show that the Sinkhorn information matrix (SIM), a key component of SiNG, has an explicit expression and can be evaluated accurately in complexity that scales logarithmically.
In our experiments, we quantitatively compare SiNG with state-of-the-art SGD-type solvers on generative tasks to demonstrate the efficiency and efficacy of our method.
arXiv Detail & Related papers (2020-11-09T02:51:17Z)
- The connections between Lyapunov functions for some optimization algorithms and differential equations [0.0]
We derive analytically a (discrete) Lyapunov function for a family of Nesterov optimization methods.
We show that the majority of typical discretizations of the family of ODEs, such as the Heavy ball method, do not possess Lyapunov functions with properties similar to those of the Lyapunov function constructed here.
arXiv Detail & Related papers (2020-09-01T19:49:11Z)
- Formal Synthesis of Lyapunov Neural Networks [61.79595926825511]
We propose an automatic and formally sound method for synthesising Lyapunov functions.
We employ a counterexample-guided approach where a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks.
Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing stronger or equal guarantees.
arXiv Detail & Related papers (2020-03-19T17:21:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.