Ortho-ODE: Enhancing Robustness of Neural ODEs against Adversarial
Attacks
- URL: http://arxiv.org/abs/2305.09179v1
- Date: Tue, 16 May 2023 05:37:06 GMT
- Title: Ortho-ODE: Enhancing Robustness of Neural ODEs against Adversarial
Attacks
- Authors: Vishal Purohit
- Abstract summary: We show that by controlling the Lipschitz constant of the ODE dynamics, robustness can be significantly improved.
We corroborate the enhanced robustness on numerous datasets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Ordinary Differential Equations (NODEs) introduced the use of
numerical solvers to solve the differential equation characterized by a Neural
Network (NN), thereby initiating a new paradigm of deep learning models with
infinite depth. NODEs were originally designed to tackle irregular time series,
yet they have also demonstrated robustness against various kinds of noise and
adversarial attacks. This paper investigates the natural robustness of NODEs
and examines the cause behind this surprising behaviour. We show that by
controlling the Lipschitz constant of the ODE dynamics, robustness can be
significantly improved. We derive our approach from Grönwall's inequality.
Further, we draw parallels between contractivity theory and Grönwall's
inequality. Experimentally, we corroborate the enhanced robustness on numerous
datasets: MNIST, CIFAR-10, and CIFAR-100. We also present the impact of
adaptive and non-adaptive solvers on the robustness of NODEs.
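The Grönwall-based argument in the abstract can be checked numerically: if the dynamics f is L-Lipschitz in x, two trajectories of x'(t) = f(x) started a distance d0 apart can diverge by at most d0 * exp(L * t). The sketch below uses a toy dynamics f(x) = W tanh(x), whose Lipschitz constant is bounded by the spectral norm of W; this is an illustrative assumption, not the paper's actual Ortho-ODE model.

```python
import numpy as np

def euler_solve(f, x0, t1, steps=1000):
    """Fixed-step (non-adaptive) Euler integration of x' = f(x)."""
    x, dt = np.array(x0, dtype=float), t1 / steps
    for _ in range(steps):
        x = x + dt * f(x)
    return x

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
L = np.linalg.norm(W, 2)        # spectral norm of W bounds the Lipschitz constant
f = lambda x: W @ np.tanh(x)    # tanh is 1-Lipschitz, so f is L-Lipschitz

x0 = rng.standard_normal(4)
delta = 1e-3 * rng.standard_normal(4)   # small adversarial-style input perturbation

t1 = 1.0
gap = np.linalg.norm(euler_solve(f, x0 + delta, t1) - euler_solve(f, x0, t1))
bound = np.linalg.norm(delta) * np.exp(L * t1)

# Gronwall's inequality: the trajectory gap never exceeds d0 * exp(L * t).
assert gap <= bound
```

Shrinking L (e.g. by constraining W) tightens the bound, which is exactly the sense in which controlling the Lipschitz constant of the dynamics limits how much an input perturbation can be amplified.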
Related papers
- Unconstrained Parametrization of Dissipative and Contracting Neural Ordinary Differential Equations (2023-04-06):
  We introduce and study a class of Deep Neural Networks (DNNs) in continuous time. We show how to endow the proposed NodeRENs with contractivity and dissipativity, crucial properties for robust learning and control.
- On Robust Classification using Contractive Hamiltonian Neural ODEs (2022-03-22):
  We employ contraction theory to improve the robustness of neural ODEs (NODEs). In NODEs, the input data corresponds to the initial condition of a dynamical system. We propose a class of contractive Hamiltonian NODEs (CH-NODEs).
- Stable Neural ODE with Lyapunov-Stable Equilibrium Points for Defending Against Adversarial Attacks (2021-10-25):
  We propose a stable neural ODE with Lyapunov-stable equilibrium points for defending against adversarial attacks (SODEF). We provide theoretical results that give insights into the stability of SODEF as well as the choice of regularizers that ensure it.
- Adaptive Feature Alignment for Adversarial Training (2021-05-31):
  CNNs are typically vulnerable to adversarial attacks, which pose a threat to security-sensitive applications. We propose adaptive feature alignment (AFA), which is trained to automatically align features of arbitrary attacking strengths.
- Neural ODE Processes (2021-03-23):
  We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs. We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data points.
- Meta-Solver for Neural Ordinary Differential Equations (2021-03-15):
  We investigate how variability in the space of solvers can improve the performance of neural ODEs. We show that the right choice of solver parameterization can significantly affect neural ODE models in terms of robustness to adversarial attacks.
- Attribute-Guided Adversarial Training for Robustness to Natural Perturbations (2020-12-03):
  We propose an adversarial training approach that learns to generate new samples so as to maximize the classifier's exposure to the attribute space. Our approach enables deep neural networks to be robust against a wide range of naturally occurring perturbations.
- Weight-Covariance Alignment for Adversarially Robust Neural Networks (2020-10-17):
  We propose a new SNN that achieves state-of-the-art performance without relying on adversarial training. While existing SNNs inject learned or hand-tuned isotropic noise, our SNN learns an anisotropic noise distribution to optimize a learning-theoretic bound on adversarial robustness.
- Hypersolvers: Toward Fast Continuous-Depth Models (2020-07-19):
  We introduce hypersolvers, neural networks designed to solve ODEs with low overhead and theoretical guarantees on accuracy. The synergistic combination of hypersolvers and Neural ODEs allows for cheap inference and unlocks a new frontier for practical applications of continuous-depth models.
- An Ode to an ODE (2020-06-19):
  We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where the time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d). This nested system of two flows provides stable and effective training and provably solves the gradient vanishing/explosion problem.
- Time Dependence in Non-Autonomous Neural ODEs (2020-05-05):
  We propose a novel family of Neural ODEs with time-varying weights that outperforms previous Neural ODE variants in both speed and representational capacity.
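Several of the papers above (and the name "Ortho-ODE" itself) point at orthogonal or contractive parameterizations as a way to cap the Lipschitz constant of the dynamics. The sketch below shows one simple such construction, projecting a weight matrix onto the orthogonal group via QR; this is an assumed illustration of the general idea, not the paper's exact method.

```python
import numpy as np

def orthogonalize(W):
    """Project an arbitrary square matrix onto the orthogonal group via QR."""
    Q, R = np.linalg.qr(W)
    # Flip column signs so Q is independent of the sign ambiguity in QR.
    return Q * np.sign(np.diag(R))

rng = np.random.default_rng(1)
Q = orthogonalize(rng.standard_normal((4, 4)))

# An orthogonal matrix has spectral norm exactly 1, so dynamics of the form
# f(x) = Q @ tanh(x) are 1-Lipschitz, and the Gronwall amplification factor
# exp(L * t) collapses to exp(t) regardless of the weight values.
assert np.allclose(Q.T @ Q, np.eye(4))
assert abs(np.linalg.norm(Q, 2) - 1.0) < 1e-8
```

Compared with an unconstrained weight matrix, whose spectral norm grows with its entries, this keeps the worst-case perturbation growth of the ODE independent of training dynamics.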