A Self-Adaptive Penalty Method for Integrating Prior Knowledge
Constraints into Neural ODEs
- URL: http://arxiv.org/abs/2307.14940v3
- Date: Tue, 5 Mar 2024 07:29:02 GMT
- Title: A Self-Adaptive Penalty Method for Integrating Prior Knowledge
Constraints into Neural ODEs
- Authors: C. Coelho, M. Fernanda P. Costa, L. L. Ferrás
- Abstract summary: We propose a self-adaptive penalty algorithm for Neural ODEs to enable modelling of constrained natural systems.
We validate the proposed approach by modelling three natural systems with prior knowledge constraints.
The self-adaptive penalty approach provides more accurate and robust models with reliable and meaningful predictions.
- Score: 3.072340427031969
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The continuous dynamics of natural systems has been effectively modelled
using Neural Ordinary Differential Equations (Neural ODEs). However, for
accurate and meaningful predictions, it is crucial that the models follow the
underlying rules or laws that govern these systems. In this work, we propose a
self-adaptive penalty algorithm for Neural ODEs to enable modelling of
constrained natural systems. The proposed self-adaptive penalty function can
dynamically adjust the penalty parameters. The explicit introduction of prior
knowledge helps to increase the interpretability of Neural ODE-based models.
We validate the proposed approach by modelling three natural systems with prior
knowledge constraints: population growth, chemical reaction evolution, and
damped harmonic oscillator motion. The numerical experiments, together with a
comparison against other penalty Neural ODE approaches and the vanilla Neural ODE,
demonstrate the effectiveness of the proposed self-adaptive penalty algorithm
for Neural ODEs in modelling constrained natural systems. Moreover, the
self-adaptive penalty approach provides more accurate and robust models with
reliable and meaningful predictions.
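The abstract does not include implementation details, but the idea can be illustrated with a minimal sketch of a penalty-based Neural ODE training loop whose penalty weight adapts during training. The sketch below assumes PyTorch with the torchdiffeq odeint solver; the positivity constraint, the multiplicative update rule for the penalty weight, and all hyperparameters are illustrative assumptions, not the authors' exact algorithm.

```python
# Illustrative sketch only: penalty-based Neural ODE training with a simple
# self-adaptive update of the penalty weight. The constraint (state
# non-negativity), the adaptation rule, and all hyperparameters are
# assumptions made for this example, not the paper's exact algorithm.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq


class ODEFunc(nn.Module):
    """Parametrises dx/dt = f_theta(t, x) with a small MLP."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, x):
        return self.net(x)


def constraint_violation(x_pred):
    # Example prior-knowledge constraint: states must stay non-negative
    # (e.g. populations or chemical concentrations). The violation is the
    # average amount by which predictions dip below zero.
    return torch.relu(-x_pred).mean()


def train(x0, t, x_true, epochs=2000, lr=1e-3, mu=1.0, tol=1e-3, growth=2.0):
    func = ODEFunc(dim=x0.shape[-1])
    opt = torch.optim.Adam(func.parameters(), lr=lr)
    mse = nn.MSELoss()
    for epoch in range(epochs):
        opt.zero_grad()
        x_pred = odeint(func, x0, t)       # solve the Neural ODE forward
        data_loss = mse(x_pred, x_true)
        violation = constraint_violation(x_pred)
        loss = data_loss + mu * violation  # penalised objective
        loss.backward()
        opt.step()
        # Assumed self-adaptive step: grow the penalty weight whenever the
        # constraint is still violated beyond a tolerance.
        if violation.item() > tol:
            mu *= growth
    return func, mu
```

Here a single scalar penalty weight grows geometrically while the constraint remains violated; the paper's self-adaptive scheme adjusts the penalty parameters dynamically during training, which this one-scalar update only loosely approximates.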
Related papers
- Projected Neural Differential Equations for Learning Constrained Dynamics [3.570367665112327]
We introduce a new method for constraining neural differential equations based on projecting the learned vector field onto the tangent space of the constraint manifold (a rough sketch of this projection idea appears after this list).
PNDEs outperform existing methods while requiring fewer hyperparameters.
The proposed approach demonstrates significant potential for enhancing the modeling of constrained dynamical systems.
arXiv Detail & Related papers (2024-10-31T06:32:43Z) - Probabilistic Decomposed Linear Dynamical Systems for Robust Discovery of Latent Neural Dynamics [5.841659874892801]
Time-varying linear state-space models are powerful tools for obtaining mathematically interpretable representations of neural signals.
Existing methods for latent variable estimation are not robust to dynamical noise and system nonlinearity.
We propose a probabilistic approach to latent variable estimation in decomposed models that improves robustness against dynamical noise.
arXiv Detail & Related papers (2024-08-29T18:58:39Z) - Individualized Dosing Dynamics via Neural Eigen Decomposition [51.62933814971523]
We introduce the Neural Eigen Differential Equation algorithm (NESDE).
NESDE provides individualized modeling, tunable generalization to new treatment policies, and fast, continuous, closed-form prediction.
We demonstrate the robustness of NESDE in both synthetic and real medical problems, and use the learned dynamics to publish simulated medical gym environments.
arXiv Detail & Related papers (2023-06-24T17:01:51Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Neural Abstractions [72.42530499990028]
We present a novel method for the safety verification of nonlinear dynamical models that uses neural networks to represent abstractions of their dynamics.
We demonstrate that our approach performs comparably to the mature tool Flow* on existing benchmark nonlinear models.
arXiv Detail & Related papers (2023-01-27T12:38:09Z) - Neural ODEs as Feedback Policies for Nonlinear Optimal Control [1.8514606155611764]
We use Neural Ordinary Differential Equations (Neural ODEs) to model continuous-time dynamics as differential equations parametrized with neural networks.
We propose the use of a neural control policy posed as a Neural ODE to solve general nonlinear optimal control problems.
arXiv Detail & Related papers (2022-10-20T13:19:26Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Cubature Kalman Filter Based Training of Hybrid Differential Equation
Recurrent Neural Network Physiological Dynamic Models [13.637931956861758]
We show how we can approximate missing ordinary differential equations with known ODEs using a neural network approximation.
Results indicate that this RBSE approach to training the NN parameters yields better outcomes (measurement/state estimation accuracy) than training the neural network with backpropagation.
arXiv Detail & Related papers (2021-10-12T15:38:13Z) - Accelerating Neural ODEs Using Model Order Reduction [0.0]
We show that mathematical model order reduction methods can be used for compressing and accelerating Neural ODEs.
We implement our novel compression method by developing Neural ODEs that integrate the necessary subspace projections and operations as layers of the neural network.
arXiv Detail & Related papers (2021-05-28T19:27:09Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how the variability in the solver space can improve the performance of neural ODEs.
We show that the right choice of solver parameterization can significantly affect the robustness of neural ODE models to adversarial attacks.
arXiv Detail & Related papers (2021-03-15T17:26:34Z)
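The tangent-space projection mentioned in the PNDE entry above can be illustrated with a short, self-contained sketch. The constraint function, its dimensions, and the use of PyTorch autograd are assumptions for illustration only; see the PNDE paper for the actual method.

```python
# Rough illustration of projecting a learned vector field value f(x) onto the
# tangent space of the constraint manifold {x : g(x) = 0}. The constraint,
# shapes, and solver choice are assumptions for this example only.
import torch


def project_to_tangent(f_val: torch.Tensor, g, x: torch.Tensor) -> torch.Tensor:
    """Remove the component of f_val that points out of {x : g(x) = 0}."""
    J = torch.autograd.functional.jacobian(g, x)  # (m, n) constraint Jacobian
    lam = torch.linalg.solve(J @ J.T, J @ f_val)  # (m,) multipliers
    return f_val - J.T @ lam                      # tangential component of f_val


def g(x):
    # Example constraint: stay on the unit circle, ||x||^2 - 1 = 0,
    # returned as a length-1 vector.
    return (x.pow(2).sum() - 1.0).unsqueeze(0)


x = torch.tensor([1.0, 0.0])
f_val = torch.tensor([0.3, 1.0])            # some learned vector field value
print(project_to_tangent(f_val, g, x))      # radial component removed -> [0., 1.]
```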