Adversarially Robust Stability Certificates can be Sample-Efficient
- URL: http://arxiv.org/abs/2112.10690v1
- Date: Mon, 20 Dec 2021 17:23:31 GMT
- Title: Adversarially Robust Stability Certificates can be Sample-Efficient
- Authors: Thomas T.C.K. Zhang, Stephen Tu, Nicholas M. Boffi, Jean-Jacques E.
Slotine, Nikolai Matni
- Abstract summary: We consider learning adversarially robust stability certificates for unknown nonlinear dynamical systems.
We show that the statistical cost of learning an adversarial stability certificate is equivalent, up to constant factors, to that of learning a nominal stability certificate.
- Score: 14.658040519472646
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Motivated by bridging the simulation to reality gap in the context of
safety-critical systems, we consider learning adversarially robust stability
certificates for unknown nonlinear dynamical systems. In line with approaches
from robust control, we consider additive and Lipschitz bounded adversaries
that perturb the system dynamics. We show that under suitable assumptions of
incremental stability on the underlying system, the statistical cost of
learning an adversarial stability certificate is equivalent, up to constant
factors, to that of learning a nominal stability certificate. Our results hinge
on novel bounds for the Rademacher complexity of the resulting adversarial loss
class, which may be of independent interest. To the best of our knowledge, this
is the first characterization of sample-complexity bounds when performing
adversarial learning over data generated by a dynamical system. We further
provide a practical algorithm for approximating the adversarial training
algorithm, and validate our findings on a damped pendulum example.