On Second Order Behaviour in Augmented Neural ODEs
- URL: http://arxiv.org/abs/2006.07220v2
- Date: Wed, 21 Oct 2020 13:59:14 GMT
- Title: On Second Order Behaviour in Augmented Neural ODEs
- Authors: Alexander Norcliffe, Cristian Bodnar, Ben Day, Nikola Simidjievski,
Pietro Liò
- Abstract summary: We consider Second Order Neural ODEs (SONODEs).
We show how the adjoint sensitivity method can be extended to SONODEs.
We extend the theoretical understanding of the broader class of Augmented NODEs (ANODEs).
- Score: 69.8070643951126
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Ordinary Differential Equations (NODEs) are a new class of models that
transform data continuously through infinite-depth architectures. The
continuous nature of NODEs has made them particularly suitable for learning the
dynamics of complex physical systems. While previous work has mostly been
focused on first order ODEs, the dynamics of many systems, especially in
classical physics, are governed by second order laws. In this work, we consider
Second Order Neural ODEs (SONODEs). We show how the adjoint sensitivity method
can be extended to SONODEs and prove that the optimisation of a first order
coupled ODE is equivalent and computationally more efficient. Furthermore, we
extend the theoretical understanding of the broader class of Augmented NODEs
(ANODEs) by showing they can also learn higher order dynamics with a minimal
number of augmented dimensions, but at the cost of interpretability. This
indicates that the advantages of ANODEs go beyond the extra space offered by
the augmented dimensions, as originally thought. Finally, we compare SONODEs
and ANODEs on synthetic and real dynamical systems and demonstrate that the
inductive biases of the former generally result in faster training and better
performance.
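The abstract's central technical point is the classical reduction of a second order ODE to a coupled first order system: introducing the velocity v = x' turns x'' = f(x, x', t) into z' = (v, f(x, v, t)), which can then be integrated (and differentiated via the adjoint method) with any first order solver. A minimal sketch of that reduction, using a hand-written damped oscillator in place of the paper's learned acceleration network:

```python
import numpy as np
from scipy.integrate import solve_ivp

# A second order ODE x'' = f(x, x', t) becomes a coupled first order
# system by introducing the velocity v = x':
#     z = (x, v),   z' = (v, f(x, v, t))
# SONODEs learn the acceleration f with a neural network; here f is a
# hand-written damped harmonic oscillator, purely for illustration.

def f(t, x, v):
    # acceleration of a damped harmonic oscillator: x'' = -x - 0.1 x'
    return -x - 0.1 * v

def coupled_rhs(t, z):
    # first order right-hand side over the stacked state z = (x, v)
    x, v = z
    return [v, f(t, x, v)]

# integrate from x(0) = 1, x'(0) = 0 with a standard first order solver
sol = solve_ivp(coupled_rhs, t_span=(0.0, 10.0), y0=[1.0, 0.0])
x_final, v_final = sol.y[:, -1]
print(x_final, v_final)  # damping shrinks the oscillation amplitude
```

The same stacked-state trick underlies the paper's efficiency claim: the adjoint pass runs over the 2d-dimensional first order system rather than a genuinely second order formulation.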
Related papers
- Pioneer: Physics-informed Riemannian Graph ODE for Entropy-increasing Dynamics [61.70424540412608]
We present a physics-informed graph ODE for a wide range of entropy-increasing dynamic systems.
We prove that the entropy of our formulation is non-decreasing, in accordance with physical law.
Empirical results show the superiority of Pioneer on real datasets.
arXiv Detail & Related papers (2025-02-05T14:54:30Z)
- ControlSynth Neural ODEs: Modeling Dynamical Systems with Guaranteed Convergence [1.1720409777196028]
Neural ODEs (NODEs) are continuous-time neural networks (NNs) that can process data without the limitation of time intervals.
We show that despite their highly nonlinear nature, convergence can be guaranteed via tractable linear inequalities.
In the composition of CSODEs, we introduce an extra control term that enables the simultaneous capture of dynamics at different scales.
arXiv Detail & Related papers (2024-11-04T17:20:42Z)
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
- Embedding Capabilities of Neural ODEs [0.0]
We study input-output relations of neural ODEs using dynamical systems theory.
We prove several results about the exact embedding of maps in different neural ODE architectures in low and high dimension.
arXiv Detail & Related papers (2023-08-02T15:16:34Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Heavy Ball Neural Ordinary Differential Equations [12.861233366398162]
We propose heavy ball neural ordinary differential equations (HBNODEs) to improve neural ODEs (NODEs) training and inference.
HBNODEs have two properties that imply practical advantages over NODEs.
We verify the advantages of HBNODEs over NODEs on benchmark tasks, including image classification, learning complex dynamics, and sequential modeling.
arXiv Detail & Related papers (2021-10-10T16:11:11Z)
- Modular Neural Ordinary Differential Equations [0.0]
We propose Modular Neural ODEs, where each force component is learned with separate modules.
We show how physical priors can be easily incorporated into these models.
arXiv Detail & Related papers (2021-09-15T15:13:12Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.