Modular Neural Ordinary Differential Equations
- URL: http://arxiv.org/abs/2109.07359v2
- Date: Thu, 16 Sep 2021 21:15:44 GMT
- Title: Modular Neural Ordinary Differential Equations
- Authors: Max Zhu, Pietro Lio, Jacob Moss
- Abstract summary: We propose Modular Neural ODEs, where each force component is learned by a separate module.
We show how physical priors can be easily incorporated into these models.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: The laws of physics have been written in the language of differential
equations for centuries. Neural Ordinary Differential Equations (NODEs) are a
new machine learning architecture that allows these differential equations to
be learned from a dataset. These have been applied to classical dynamics
simulations in the form of Lagrangian Neural Networks (LNNs) and Second Order
Neural Differential Equations (SONODEs). However, they either cannot represent
the most general equations of motion or lack interpretability. In this paper,
we propose Modular Neural ODEs, where each force component is learned by a
separate module. We show how physical priors can be easily incorporated into
these models. Through a number of experiments, we demonstrate that these models
achieve better performance, are more interpretable, and offer added flexibility
due to their modularity.
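To make the modular idea concrete, a minimal sketch follows (my own illustration, not the authors' released code): the net force on a unit-mass particle is the sum of independent modules, so a fixed physical prior such as gravity composes freely with a learned residual force. The module names and the plain Euler integrator are illustrative assumptions.

```python
# Minimal sketch of a Modular Neural ODE (illustrative, not the authors' code).
# The net force is a sum of independent modules, so a known physical prior
# (gravity) and a learned residual force can be mixed and swapped freely.
import torch
import torch.nn as nn

class GravityModule(nn.Module):
    """Fixed physical prior: constant downward acceleration (assumed form)."""
    def forward(self, q, v):
        return torch.tensor([0.0, -9.81]).expand_as(q)

class LearnedForce(nn.Module):
    """Learned residual force, e.g. drag or an unknown interaction."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, dim))
    def forward(self, q, v):
        return self.net(torch.cat([q, v], dim=-1))

class ModularODE(nn.Module):
    """Second-order dynamics: q'' = sum of force modules (unit mass)."""
    def __init__(self, force_modules):
        super().__init__()
        self.force_modules = nn.ModuleList(force_modules)
    def forward(self, q, v):
        return sum(m(q, v) for m in self.force_modules)

def euler_rollout(ode, q, v, dt=0.01, steps=100):
    """Plain explicit Euler; a real setup would use an adaptive ODE solver."""
    traj = [q]
    for _ in range(steps):
        q, v = q + dt * v, v + dt * ode(q, v)
        traj.append(q)
    return torch.stack(traj)

ode = ModularODE([GravityModule(), LearnedForce()])
traj = euler_rollout(ode, torch.zeros(1, 2), torch.ones(1, 2))
```

Swapping a module in or out changes the hypothesis about one force component without touching the rest, which is the flexibility the abstract refers to.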
Related papers
- Neural Fractional Differential Equations [2.812395851874055]
Fractional Differential Equations (FDEs) are essential tools for modelling complex systems in science and engineering.
We propose the Neural FDE, a novel deep neural network architecture that adjusts an FDE to the dynamics of data.
arXiv Detail & Related papers (2024-03-05T07:45:29Z)
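As a rough illustration of fitting a fractional-order equation to data (my own Grünwald-Letnikov toy scheme, which ignores initial-condition correction terms; not the paper's solver), the sketch below makes both the right-hand side f and the fractional order alpha trainable:

```python
# Toy Neural FDE sketch (illustrative): solve D^alpha y = f_theta(y) with an
# explicit Grunwald-Letnikov scheme and a trainable order alpha in (0, 1).
import torch
import torch.nn as nn

class NeuralFDE(nn.Module):
    def __init__(self, dim=1, hidden=32):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                               nn.Linear(hidden, dim))
        self.alpha_raw = nn.Parameter(torch.zeros(()))  # sigmoid -> (0, 1)

    def forward(self, y0, h=0.05, steps=100):
        alpha = torch.sigmoid(self.alpha_raw)
        ys = [y0]
        w = [torch.ones(())]                  # Grunwald-Letnikov weights
        for n in range(1, steps + 1):
            w.append(w[-1] * (1 - (alpha + 1) / n))
            # the memory term: every past state enters the update
            hist = sum(w[k] * ys[n - k] for k in range(1, n + 1))
            ys.append(h**alpha * self.f(ys[-1]) - hist)
        return torch.stack(ys)

model = NeuralFDE()
traj = model(torch.tensor([[1.0]]))  # shape: (steps+1, 1, 1)
```

As alpha approaches 1, the update reduces to the explicit Euler step of an ordinary NODE, so the integer-order case is recovered as a limit.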
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
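A heavily simplified stand-in for that pipeline (my assumptions: sparse regression over a fixed operator library replaces full symbolic regression, and the mesh is a periodic 1D grid): DEC reduces here to incidence-matrix operators, and the discovered model is the sparse combination of DEC-built terms that best explains u_t.

```python
# Illustrative stand-in for the SR + DEC idea: build discrete operators on a
# periodic 1D mesh, then select a sparse model from an operator library.
import numpy as np

N = 128
h = 2 * np.pi / N
x = np.arange(N) * h

# d0: discrete exterior derivative from 0-forms (nodes) to 1-forms (edges)
d0 = (np.eye(N, k=1) - np.eye(N)) / h
d0[-1, 0] = 1 / h                       # periodic closure
lap = -d0.T @ d0                        # DEC-built Laplacian on 0-forms

# centered-difference operator as a second library building block
adv = (np.eye(N, k=1) - np.eye(N, k=-1)) / (2 * h)
adv[0, -1], adv[-1, 0] = -1 / (2 * h), 1 / (2 * h)

# synthetic data generated by the heat equation u_t = 0.5 * lap(u)
u = np.sin(x) + 0.3 * np.sin(3 * x)
u_t = 0.5 * (lap @ u)

# candidate library; full SR would search symbolic expressions instead
library = np.stack([lap @ u, adv @ u, u, u ** 3], axis=1)
names = ["lap(u)", "d_x(u)", "u", "u^3"]

# least squares plus hard thresholding: a crude sparse-regression stand-in
coef, *_ = np.linalg.lstsq(library, u_t, rcond=None)
coef[np.abs(coef) < 1e-3] = 0.0
print(dict(zip(names, np.round(coef, 3))))   # expect {"lap(u)": 0.5, ...}
```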
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how second-order continuity can be incorporated into GNNs while maintaining the equivariance property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
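A minimal second-order, E(n)-equivariant update in this spirit (my simplified layer, not the SEGNO reference implementation): messages depend only on invariant squared distances, the predicted acceleration is a weighted sum of relative position vectors, and positions are obtained by integrating the velocity, which is what makes the update second-order.

```python
# Sketch of a second-order equivariant update (SEGNO-like, simplified).
# Messages use only rotation-invariant distances; the acceleration is a
# weighted sum of relative position vectors, preserving E(n)-equivariance.
import torch
import torch.nn as nn

class SecondOrderEGNNLayer(nn.Module):
    def __init__(self, h_dim=16, hidden=64):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * h_dim + 1, hidden), nn.SiLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, h, x, v, dt=0.1):
        diff = x[:, None, :] - x[None, :, :]           # (N, N, D) relative pos
        d2 = (diff ** 2).sum(-1, keepdim=True)         # invariant feature
        hi = h[:, None, :].expand(-1, h.size(0), -1)
        hj = h[None, :, :].expand(h.size(0), -1, -1)
        w = self.msg(torch.cat([hi, hj, d2], dim=-1))  # scalar per edge
        a = (w * diff).sum(dim=1)                      # equivariant acceleration
        v = v + dt * a                                 # second-order behaviour:
        x = x + dt * v                                 # x is updated through v
        return h, x, v

layer = SecondOrderEGNNLayer()
h = torch.randn(5, 16); x = torch.randn(5, 3); v = torch.zeros(5, 3)
h, x, v = layer(h, x, v)
```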
- Pseudo-Hamiltonian neural networks for learning partial differential equations [0.0]
Pseudo-Hamiltonian neural networks (PHNN) were recently introduced for learning dynamical systems that can be modelled by ordinary differential equations.
In this paper, we extend the method to partial differential equations.
The resulting model comprises up to three neural networks, modelling the conservation, dissipation and external-force terms, and discrete convolution operators that can either be learned or given as input.
arXiv Detail & Related papers (2023-04-27T17:46:00Z)
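A sketch of that three-part split for a 1D periodic PDE (the concrete form u_t = d_x(dH/du) - dV/du + f(x), and all names, are my assumptions, not the reference code): separate networks model the conserved-energy density, the dissipation density, and the external force, with spatial derivatives from a fixed stencil.

```python
# Sketch of a pseudo-Hamiltonian neural network for a 1D PDE. The split into
# conservation + dissipation + external force follows the abstract; the exact
# equation form here is an assumption for illustration.
import torch
import torch.nn as nn

def d_x(u, h):
    """Periodic centered difference, i.e. a fixed convolution stencil."""
    return (torch.roll(u, -1, -1) - torch.roll(u, 1, -1)) / (2 * h)

class PHNN1D(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        mlp = lambda: nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                    nn.Linear(hidden, 1))
        self.H_density = mlp()   # integrand of the conserved energy H[u]
        self.V_density = mlp()   # integrand of the dissipation functional
        self.force = mlp()       # external force as a function of x

    def variational_grad(self, density, u):
        u = u.detach().requires_grad_(True)
        total = density(u.unsqueeze(-1)).sum()
        return torch.autograd.grad(total, u, create_graph=True)[0]

    def forward(self, u, x, h):
        dH = self.variational_grad(self.H_density, u)   # conservative part
        dV = self.variational_grad(self.V_density, u)   # dissipative part
        f = self.force(x.unsqueeze(-1)).squeeze(-1)     # external forcing
        return d_x(dH, h) - dV + f                      # u_t

N = 64; h = 2 * torch.pi / N
x = torch.arange(N) * h
u_t = PHNN1D()(torch.sin(x), x, h)
```

Because each term has its own network, a known energy or forcing can be substituted for its learned counterpart, mirroring the modularity theme of the main paper.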
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly represents both the governing PDE and the material model.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
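One way such priors can be guaranteed architecturally, sketched under my own assumptions (a generic invariant-based construction, not NCLaw's actual architecture): let the learned strain energy depend on the deformation gradient F only through rotation-invariant quantities, so material frame-indifference holds by construction, and recover the stress by differentiation.

```python
# Illustrative prior-preserving constitutive network (not NCLaw's actual
# architecture): the learned energy depends on F only through invariants of
# C = F^T F, so frame-indifference holds by construction; stress via autograd.
import torch
import torch.nn as nn

class InvariantConstitutiveLaw(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.energy = nn.Sequential(nn.Linear(3, hidden), nn.SiLU(),
                                    nn.Linear(hidden, 1))

    def invariants(self, F):
        C = F.transpose(-1, -2) @ F              # right Cauchy-Green tensor
        I1 = C.diagonal(dim1=-2, dim2=-1).sum(-1)
        I2 = 0.5 * (I1 ** 2 - (C @ C).diagonal(dim1=-2, dim2=-1).sum(-1))
        I3 = torch.det(C)
        return torch.stack([I1, I2, I3], dim=-1)

    def forward(self, F):
        F = F.detach().requires_grad_(True)
        psi = self.energy(self.invariants(F)).sum()   # strain energy density
        return torch.autograd.grad(psi, F, create_graph=True)[0]  # dpsi/dF

law = InvariantConstitutiveLaw()
F = torch.eye(3).expand(4, 3, 3) + 0.01 * torch.randn(4, 3, 3)
stress = law(F)   # (4, 3, 3), Piola-Kirchhoff-like stress
```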
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
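The core idea in miniature (my toy reconstruction; the actual model learns Laplace-domain representations with an encoder and a numerical inverse-Laplace-transform algorithm): a trajectory is represented as a finite sum of complex exponentials, i.e. the inverse transform of a rational F(s) with learned poles and residues.

```python
# Toy version of the Laplace-domain idea (not the Neural Laplace model):
# learn poles s_k and residues c_k of F(s) = sum_k c_k / (s - s_k); its
# inverse transform y(t) = sum_k Re(c_k * exp(s_k * t)) is evaluated directly.
import torch
import torch.nn as nn

class SumOfExponentials(nn.Module):
    def __init__(self, n_poles=8):
        super().__init__()
        self.pole_re = nn.Parameter(-torch.rand(n_poles))   # decaying modes
        self.pole_im = nn.Parameter(torch.randn(n_poles))
        self.res_re = nn.Parameter(torch.randn(n_poles) * 0.1)
        self.res_im = nn.Parameter(torch.randn(n_poles) * 0.1)

    def forward(self, t):
        s = torch.complex(self.pole_re, self.pole_im)       # poles s_k
        c = torch.complex(self.res_re, self.res_im)         # residues c_k
        modes = c * torch.exp(s * t[:, None].to(s.dtype))   # (T, K)
        return modes.sum(-1).real

model = SumOfExponentials()
t = torch.linspace(0, 10, 200)
y_true = torch.sin(2 * t) * torch.exp(-0.3 * t)             # damped oscillator
opt = torch.optim.Adam(model.parameters(), lr=0.05)
for _ in range(500):
    opt.zero_grad()
    loss = ((model(t) - y_true) ** 2).mean()
    loss.backward()
    opt.step()
```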
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- On Neural Differential Equations [13.503274710499971]
Neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin.
NDEs are suitable for tackling generative problems, dynamical systems, and time series.
NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.
arXiv Detail & Related papers (2022-02-04T23:32:29Z)
- On Second Order Behaviour in Augmented Neural ODEs [69.8070643951126]
We consider Second Order Neural ODEs (SONODEs).
We show how the adjoint sensitivity method can be extended to SONODEs.
We extend the theoretical understanding of the broader class of Augmented NODEs (ANODEs).
arXiv Detail & Related papers (2020-06-12T14:25:31Z)
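The first-order reduction that underlies SONODEs is standard: a second-order ODE x'' = f(x, x') becomes a first-order system in the augmented state (x, v), which an ordinary NODE solver and the adjoint method can then handle. The sketch below illustrates this with a generic learned acceleration; the torchdiffeq usage is shown only as a comment.

```python
# The first-order reduction behind SONODEs: augment the state with a velocity
# so that x'' = f_theta(x, v) becomes d(x, v)/dt = (v, f_theta(x, v)).
import torch
import torch.nn as nn

class SONODEFunc(nn.Module):
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.accel = nn.Sequential(nn.Linear(2 * dim, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim))

    def forward(self, t, z):
        x, v = z.chunk(2, dim=-1)       # augmented state z = (x, v)
        a = self.accel(z)               # learned acceleration x'' = f(x, v)
        return torch.cat([v, a], dim=-1)

# with torchdiffeq installed, an adjoint-based solve would look like:
#   from torchdiffeq import odeint_adjoint as odeint
#   z_traj = odeint(SONODEFunc(), z0, torch.linspace(0, 1, 20))
func = SONODEFunc()
z0 = torch.randn(4, 4)                  # batch of (x, v) pairs, dim = 2 each
dz = func(torch.tensor(0.0), z0)
```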
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.