Learning second order coupled differential equations that are subject to
non-conservative forces
- URL: http://arxiv.org/abs/2010.11270v2
- Date: Thu, 29 Jul 2021 11:33:52 GMT
- Title: Learning second order coupled differential equations that are subject to
non-conservative forces
- Authors: Roger Alexander Müller, Jonathan Laflamme-Janssen, Jaime Camacaro,
Carolina Bessega
- Abstract summary: We introduce a network that incorporates a difference approximation for the second order derivative in terms of residual connections between convolutional blocks.
We optimize this map together with the solver network, while sharing their weights, to form a powerful framework capable of learning the complex physical properties of a dissipative dynamical system.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this article we address the question of whether it is possible
to learn the differential equations describing the physical properties of a
dynamical system, subject to non-conservative forces, from observations of
its real-space trajectory(ies) only. We introduce a network that incorporates
a difference approximation for the second-order derivative in terms of
residual connections between convolutional blocks, whose shared weights
represent the coefficients of a second-order ordinary differential equation.
We further combine this solver-like architecture with a convolutional network
that learns the relation between the trajectories of coupled oscillators,
which allows us to make a stable forecast even if the system is only
partially observed. We optimize this map together with the solver network,
while sharing their weights, to form a powerful framework capable of learning
the complex physical properties of a dissipative dynamical system.
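The "difference approximation for the second order derivative" mentioned above
is the standard central-difference identity x''(t) ≈ (x(t+h) - 2 x(t) + x(t-h)) / h^2,
which turns a second-order ODE into the residual-style update
x_{n+1} = 2 x_n - x_{n-1} + h^2 f(x_n, v_n). The sketch below is a minimal
illustration of that idea, assuming PyTorch; the module name, layer shapes, and
damped-oscillator framing are our assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the authors' code): unroll the central-difference
# update x_{n+1} = 2*x_n - x_{n-1} + h^2 * f_theta(x_n, v_n), where a single
# weight-shared 1x1 convolution f_theta plays the role of the ODE coefficients.
import torch
import torch.nn as nn


class DifferenceSolverNet(nn.Module):
    """Unrolls a learned second-order ODE via residual (skip) connections."""

    def __init__(self, n_osc: int, dt: float):
        super().__init__()
        self.dt = dt
        # Shared weights: a linear map from (x, v) to each oscillator's
        # acceleration; couplings between oscillators sit in the off-diagonal
        # entries, damping (the non-conservative part) in the velocity block.
        self.accel = nn.Conv1d(2 * n_osc, n_osc, kernel_size=1, bias=False)

    def forward(self, x_prev, x_curr, n_steps: int):
        # x_prev, x_curr: (batch, n_osc, 1) -- two consecutive observed frames.
        traj = [x_prev, x_curr]
        for _ in range(n_steps):
            v = (x_curr - x_prev) / self.dt                  # finite-difference velocity
            a = self.accel(torch.cat([x_curr, v], dim=1))    # learned f_theta(x, v)
            x_next = 2 * x_curr - x_prev + self.dt ** 2 * a  # central-difference step
            traj.append(x_next)
            x_prev, x_curr = x_curr, x_next
        return torch.cat(traj, dim=-1)                       # (batch, n_osc, n_steps + 2)


if __name__ == "__main__":
    model = DifferenceSolverNet(n_osc=3, dt=0.05)
    x0 = torch.randn(8, 3, 1)
    x1 = x0 + 0.01 * torch.randn(8, 3, 1)
    pred = model(x0, x1, n_steps=100)
    # Placeholder target: in practice the loss would compare against observed
    # real-space trajectories of the coupled oscillators.
    loss = nn.functional.mse_loss(pred, torch.zeros_like(pred))
    loss.backward()
```

Because the same accel weights are reused at every unrolled step, they can be
read off after training as the (time-independent) coefficients of the learned
second-order ODE, which is the sense in which the architecture is solver-like.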
Related papers
- Stability analysis of chaotic systems in latent spaces [4.266376725904727]
We show that a latent-space approach can infer the solution of a chaotic partial differential equation.
It can also predict the stability properties of the physical system.
arXiv Detail & Related papers (2024-10-01T08:09:14Z)
- Absence of Closed-Form Descriptions for Gradient Flow in Two-Layer Narrow Networks [0.8158530638728501]
We show that the dynamics of the gradient flow in two-layer narrow networks is not an integrable system.
Under mild conditions, the identity component of the differential Galois group of the variational equations of the gradient flow is non-solvable.
This result confirms the system's non-integrability and implies that the training dynamics cannot be represented by Liouvillian functions.
arXiv Detail & Related papers (2024-08-15T17:40:11Z)
- Dual symplectic classical circuits: An exactly solvable model of many-body chaos [0.0]
We prove that two-point dynamical correlation functions are non-vanishing only along the edges of the light cones.
We test our theory in a specific family of dual-symplectic circuits, describing the dynamics of a classical Floquet spin chain.
arXiv Detail & Related papers (2023-07-04T15:48:41Z)
- Initial Correlations in Open Quantum Systems: Constructing Linear Dynamical Maps and Master Equations [62.997667081978825]
We show that, for any predetermined initial correlations, one can introduce a linear dynamical map on the space of operators of the open system.
We demonstrate that this construction leads to a linear, time-local quantum master equation with generalized Lindblad structure.
arXiv Detail & Related papers (2022-10-24T13:43:04Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Structural Inference of Networked Dynamical Systems with Universal Differential Equations [2.4231435999251927]
Networked dynamical systems are common throughout science and engineering.
We seek to infer (i) the intrinsic physics of a base unit of a population, (ii) the underlying graphical structure shared between units, and (iii) the coupling physics of a given networked dynamical system.
arXiv Detail & Related papers (2022-07-11T15:40:53Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Continuous and time-discrete non-Markovian system-reservoir interactions: Dissipative coherent quantum feedback in Liouville space [62.997667081978825]
We investigate a quantum system simultaneously exposed to two structured reservoirs.
We employ a numerically exact quasi-2D tensor network combining both diagonal and off-diagonal system-reservoir interactions with a twofold memory for continuous and discrete retardation effects.
As a possible example, we study the non-Markovian interplay between discrete photonic feedback and structured acoustic phonon modes, resulting in emerging inter-reservoir correlations and long-lived population trapping within an initially excited two-level system.
arXiv Detail & Related papers (2020-11-10T12:38:35Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
The purpose of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate the approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to state-of-the-art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
- The interplay between local and non-local master equations: exact and approximated dynamics [0.0]
We derive an exact connection between the time-local and the integro-differential descriptions, focusing on the class of commutative dynamics.
We investigate a Redfield-like approximation, that transforms the exact integro-differential equation into a time-local one by means of a coarse graining in time.
arXiv Detail & Related papers (2020-01-31T16:55:01Z)
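As background for the last entry above, the two descriptions it connects are,
schematically, a time-local master equation, whose generator acts on the
current state only, and an integro-differential equation with a memory kernel.
The symbols below are generic placeholders, not notation taken from that paper.

```latex
% Schematic forms only; \mathcal{L}(t) and \mathcal{K}(t,s) are generic placeholders.
\begin{align}
  \frac{\mathrm{d}\rho(t)}{\mathrm{d}t} &= \mathcal{L}(t)\,\rho(t)
    && \text{(time-local description)} \\
  \frac{\mathrm{d}\rho(t)}{\mathrm{d}t} &= \int_{0}^{t}\mathrm{d}s\,\mathcal{K}(t,s)\,\rho(s)
    && \text{(integro-differential description)}
\end{align}
```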