Learning Runge-Kutta Integration Schemes for ODE Simulation and
Identification
- URL: http://arxiv.org/abs/2105.04999v1
- Date: Tue, 11 May 2021 13:02:20 GMT
- Title: Learning Runge-Kutta Integration Schemes for ODE Simulation and
Identification
- Authors: Said Ouala, Laurent Debreu, Ananda Pascual, Bertrand Chapron, Fabrice
Collard, Lucile Gaultier and Ronan Fablet
- Abstract summary: We propose a novel framework to learn integration schemes that minimize an integration-related cost function.
We demonstrate the relevance of the proposed learning-based approach for non-linear equations.
- Score: 35.877707234293624
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deriving analytical solutions of ordinary differential equations is
possible only for a small subset of problems, so numerical techniques are used
instead. Inevitably, a numerical simulation of a differential equation will then
always differ from the true analytical solution. An efficient integration scheme
should therefore not only advance a trajectory from a given state, but also be
designed so that the generated simulation remains close to the analytical one.
Consequently, several integration schemes have been developed for different
classes of differential equations. Unfortunately, when integrating complex
non-linear systems, as well as when identifying non-linear equations from data,
the choice of integration scheme is often far from trivial. In this paper, we
propose a novel framework to learn integration schemes that minimize an
integration-related cost function. We demonstrate the relevance of the proposed
learning-based approach for non-linear equations and include a quantitative
analysis with respect to classical state-of-the-art integration techniques,
especially where the latter may not apply.
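To make the idea concrete, here is a minimal sketch, not the authors' implementation, of learning the coefficients of a two-stage explicit Runge-Kutta scheme by gradient descent on a trajectory-matching cost. The test ODE (dx/dt = -x^3), the fine RK4 reference, the step size and all hyper-parameters are illustrative assumptions.

```python
# Hedged sketch: learn the entries of a 2-stage explicit Butcher tableau by
# minimising a simulation-error cost against a reference trajectory.
import torch

def f(x):
    # Illustrative non-linear ODE: dx/dt = -x^3
    return -x ** 3

# Trainable tableau entries:
# k1 = f(x), k2 = f(x + h*a21*k1), x_next = x + h*(b1*k1 + b2*k2)
a21 = torch.tensor(0.5, requires_grad=True)
b = torch.tensor([0.0, 1.0], requires_grad=True)

def learned_step(x, h):
    k1 = f(x)
    k2 = f(x + h * a21 * k1)
    return x + h * (b[0] * k1 + b[1] * k2)

def rk4_step(x, h):
    # Classical RK4, used here only to build a fine reference trajectory.
    k1 = f(x)
    k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2)
    k4 = f(x + h * k3)
    return x + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(x0, h, n_steps, step_fn):
    x, traj = x0, [x0]
    for _ in range(n_steps):
        x = step_fn(x, h)
        traj.append(x)
    return torch.stack(traj)

x0, h, n = torch.tensor(1.5), 0.1, 50
with torch.no_grad():
    # Reference sampled at the coarse time grid (n + 1 points).
    reference = simulate(x0, h / 20, n * 20, rk4_step)[::20]

optimizer = torch.optim.Adam([a21, b], lr=1e-2)
for _ in range(500):
    optimizer.zero_grad()
    loss = torch.mean((simulate(x0, h, n, learned_step) - reference) ** 2)
    loss.backward()
    optimizer.step()

print("learned a21:", a21.item(), "learned b:", b.detach().tolist())
```

Classical schemes such as the midpoint rule (a21 = 1/2, b = [0, 1]) and Heun's method (a21 = 1, b = [1/2, 1/2]) are particular points of this parameter space; a learned scheme instead tunes the coefficients for accuracy on the targeted dynamics and step size, one example of the integration-related costs the abstract refers to.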
Related papers
- A Constant Velocity Latent Dynamics Approach for Accelerating Simulation of Stiff Nonlinear Systems [0.0]
Solving stiff ordinary differential equations (StODEs) requires sophisticated numerical solvers, which are often computationally expensive.
In this work, we take a different path and learn a latent dynamics for StODEs that avoids numerical integration entirely.
In other words, the solution of the original dynamics is encoded into a sequence of straight lines in latent space, which can be decoded back to retrieve the actual solution whenever required (as sketched below).
arXiv Detail & Related papers (2025-01-14T20:32:31Z)
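A minimal sketch of the constant-velocity latent idea above, assuming a simple PyTorch encoder/decoder pair; the module name, layer sizes and dimensions are illustrative and do not come from the cited paper. The latent state moves along a straight line, so the solution can be decoded at arbitrary times without any numerical integration (training against reference trajectories is omitted).

```python
# Hedged sketch: encode the initial condition to a latent position and a
# constant latent velocity, evolve on a straight line, decode at any time.
import torch
import torch.nn as nn

class ConstantVelocityLatentODE(nn.Module):
    def __init__(self, state_dim=2, latent_dim=8):
        super().__init__()
        # Encoder outputs both the latent position z0 and a constant velocity v.
        self.encoder = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                     nn.Linear(64, 2 * latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                     nn.Linear(64, state_dim))

    def forward(self, x0, t):
        # x0: (batch, state_dim); t: (n_times,) evaluation times.
        z0, v = self.encoder(x0).chunk(2, dim=-1)
        # Straight lines in latent space: z(t) = z0 + t * v.
        z_t = z0[:, None, :] + t[None, :, None] * v[:, None, :]
        return self.decoder(z_t)  # (batch, n_times, state_dim)

model = ConstantVelocityLatentODE()
x0 = torch.randn(4, 2)
t = torch.linspace(0.0, 5.0, 20)
print(model(x0, t).shape)  # torch.Size([4, 20, 2])
```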
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variate functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand, as sketched below.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
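A minimal sketch of the anti-derivative trick mentioned above, in one dimension: a network G is fitted so that its derivative matches the integrand, its exact integral over [a, b] is then G(b) - G(a), and the residual is handled by plain Monte Carlo. The integrand, network shape and training loop are illustrative assumptions rather than the cited method.

```python
# Hedged sketch of a neural control variate built from an anti-derivative.
import torch
import torch.nn as nn

f = lambda x: torch.exp(-x ** 2)  # integrand whose integral over [a, b] we want
a, b = 0.0, 2.0

# G approximates an anti-derivative of f; its derivative g = dG/dx is the
# control variate, and its exact integral over [a, b] is G(b) - G(a).
G = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))

def g(x):
    x = x.requires_grad_(True)
    return torch.autograd.grad(G(x).sum(), x, create_graph=True)[0]

# Fit G so that its derivative matches f at random points (least squares).
optimizer = torch.optim.Adam(G.parameters(), lr=1e-3)
for _ in range(2000):
    x = a + (b - a) * torch.rand(256, 1)
    loss = ((g(x) - f(x)) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Control-variate estimate: exact part from G plus a Monte Carlo correction
# for the residual f - g.
with torch.no_grad():
    exact_part = (G(torch.tensor([[b]])) - G(torch.tensor([[a]]))).item()
x = a + (b - a) * torch.rand(10000, 1)
residual = (b - a) * (f(x) - g(x)).detach().mean().item()
print("estimate:", exact_part + residual)
```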
- Closed-form Solutions: A New Perspective on Solving Differential Equations [12.048106653998044]
This paper presents a novel machine learning-based solver, SSDE, which employs reinforcement learning to derive symbolic closed-form solutions for various differential equations.
Our evaluations on a range of ordinary and partial differential equations demonstrate that SSDE provides superior performance in achieving analytical solutions compared to other machine learning approaches.
arXiv Detail & Related papers (2024-05-23T14:29:15Z)
- Towards true discovery of the differential equations [57.089645396998506]
Differential equation discovery is a machine learning subfield used to develop interpretable models.
This paper explores the prerequisites and tools for independent equation discovery without expert input.
arXiv Detail & Related papers (2023-08-09T12:03:12Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms is not only simple and efficient but also applicable to a broad range of contexts.
arXiv Detail & Related papers (2021-07-23T13:43:34Z)
- Differential Equation Based Path Integral for System-Bath Dynamics [0.0]
We propose the differential equation based path integral (DEBPI) method to simulate the real-time evolution of open quantum systems.
New numerical schemes can be derived by discretizing these differential equations.
It is numerically verified that in certain cases, by selecting appropriate systems and applying suitable numerical schemes, the memory cost required in the i-QuAPI method can be significantly reduced.
arXiv Detail & Related papers (2021-07-22T15:06:22Z)
- Personalized Algorithm Generation: A Case Study in Meta-Learning ODE Integrators [6.457555233038933]
We study the meta-learning of numerical algorithms for scientific computing.
We develop a machine learning approach that automatically learns solvers for initial value problems.
arXiv Detail & Related papers (2021-05-04T05:42:33Z)
- Symbolic Regression using Mixed-Integer Nonlinear Optimization [9.638685454900047]
Symbolic Regression (SR) is a hard problem in machine learning.
We propose a hybrid algorithm that combines mixed-integer nonlinear optimization with explicit enumeration.
We show that our algorithm is competitive, for some synthetic data sets, with a state-of-the-art SR software and a recent physics-inspired method called AI Feynman.
arXiv Detail & Related papers (2020-06-11T20:53:17Z)
- On dissipative symplectic integration with applications to gradient-based optimization [77.34726150561087]
We propose a geometric framework in which discretizations can be realized systematically.
We show that a generalization of symplectic integrators to nonconservative and in particular dissipative Hamiltonian systems is able to preserve rates of convergence up to a controlled error.
arXiv Detail & Related papers (2020-04-15T00:36:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.