Time-Reversal Symmetric ODE Network
- URL: http://arxiv.org/abs/2007.11362v3
- Date: Thu, 7 Jan 2021 01:42:46 GMT
- Title: Time-Reversal Symmetric ODE Network
- Authors: In Huh, Eunho Yang, Sung Ju Hwang, Jinwoo Shin
- Abstract summary: Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry.
We show that, even for systems that do not possess the full time-reversal symmetry, TRS-ODENs can achieve better predictive performance than baselines.
- Score: 138.02741983098454
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time-reversal symmetry, which requires that the dynamics of a system
should not change under reversal of the time axis, is a fundamental property
that frequently holds in classical and quantum mechanics. In this paper, we
propose a novel loss function that measures how well our ordinary differential
equation (ODE) networks comply with this time-reversal symmetry; it is formally
defined by the discrepancy in the time evolutions of ODE networks between
forward and backward dynamics. Then, we design a new framework, which we name
Time-Reversal Symmetric ODE Networks (TRS-ODENs), that can learn the dynamics
of physical systems more sample-efficiently by training with the proposed loss
function. We evaluate TRS-ODENs on several classical dynamical systems and find
that they can learn the desired time evolution from observed noisy and complex
trajectories. We also show that, even for systems that do not possess the full
time-reversal symmetry, TRS-ODENs can achieve better predictive performance
than baselines.
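The loss is straightforward to prototype. Below is a minimal PyTorch sketch of the idea, assuming a phase-space state x = (q, p), the reversing operator R(q, p) = (q, -p), and a plain explicit-Euler integrator; the network size, step count, and integrator are illustrative choices, not the paper's exact setup.

```python
# Minimal sketch of a time-reversal symmetry (TRS) loss, assuming a state
# x = (q, p), the reversing operator R(q, p) = (q, -p), and explicit Euler.
import torch
import torch.nn as nn

ode_fn = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

def reverse(x):
    """Time-reversal operator R: (q, p) -> (q, -p)."""
    q, p = x[..., :1], x[..., 1:]
    return torch.cat([q, -p], dim=-1)

def rollout(x0, dt, n_steps):
    """Explicit-Euler time evolution of the ODE network."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(xs[-1] + dt * ode_fn(xs[-1]))
    return torch.stack(xs)                        # (n_steps + 1, batch, 2)

def trs_loss(x0, dt, n_steps):
    """Discrepancy between forward dynamics and reversed backward dynamics."""
    fwd = rollout(x0, dt, n_steps)                # forward trajectory
    bwd = rollout(reverse(fwd[-1]), dt, n_steps)  # restart from reversed end state
    # If the network is TRS, the reversed backward pass retraces the forward one.
    return ((reverse(bwd) - fwd.flip(0)) ** 2).mean()

x0 = torch.randn(16, 2)                           # batch of (q, p) initial states
loss = trs_loss(x0, dt=0.1, n_steps=20)
loss.backward()
```

In training, this term would be added to an ordinary trajectory-fitting loss, so the symmetry acts as an inductive bias rather than a hard constraint.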
Related papers
- TANGO: Time-Reversal Latent GraphODE for Multi-Agent Dynamical Systems [43.39754726042369]
We propose a simple-yet-effective self-supervised regularization term as a soft constraint that aligns the forward and backward trajectories predicted by a continuous graph neural network-based ordinary differential equation (GraphODE).
It effectively imposes time-reversal symmetry to enable more accurate model predictions across a wider range of dynamical systems under classical mechanics.
Experimental results on a variety of physical systems demonstrate the effectiveness of our proposed method.
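A minimal sketch of such a forward-backward alignment regularizer, assuming an explicit-Euler integrator and a plain MLP vector field in place of TANGO's GraphODE; `lambda_rev`, the sizes, and the step count are illustrative assumptions.

```python
# Hedged sketch: data-fitting loss plus a soft penalty that aligns the
# forward trajectory with the time-reversed backward trajectory.
import torch
import torch.nn as nn

f = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))  # toy vector field

def integrate(x0, dt, n):
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] + dt * f(xs[-1]))
    return torch.stack(xs)

def total_loss(x0, targets, dt, lambda_rev=0.5):
    n = targets.shape[0] - 1
    fwd = integrate(x0, dt, n)                 # forward prediction
    bwd = integrate(fwd[-1], -dt, n)           # integrate backward in time
    data = ((fwd - targets) ** 2).mean()       # supervised trajectory loss
    rev = ((bwd.flip(0) - fwd) ** 2).mean()    # soft time-reversal constraint
    return data + lambda_rev * rev

targets = torch.randn(11, 8, 2)                # (time, batch, state) toy data
loss = total_loss(targets[0], targets, dt=0.1)
loss.backward()
```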
arXiv Detail & Related papers (2023-10-10T08:52:16Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
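A hedged sketch of SGLD with without-replacement minibatching: each epoch shuffles the data once and visits every sample exactly once, and each update adds Gaussian noise scaled by a temperature. The toy objective and constants are illustrative, not from the paper.

```python
# Hedged sketch: SGLD updates with without-replacement (shuffled epoch) batches.
import torch

torch.manual_seed(0)
data = torch.randn(1000, 10)
theta = torch.zeros(10, requires_grad=True)
lr, temperature, batch = 1e-3, 1e-4, 100

for epoch in range(5):
    perm = torch.randperm(data.shape[0])        # shuffle once per epoch:
    for i in range(0, data.shape[0], batch):    # each sample used exactly once
        x = data[perm[i:i + batch]]
        loss = ((x - theta) ** 2).mean()        # toy objective
        grad, = torch.autograd.grad(loss, theta)
        with torch.no_grad():
            noise = torch.randn_like(theta) * (2 * lr * temperature) ** 0.5
            theta += -lr * grad + noise         # Langevin update: drift + diffusion
```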
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Physics-Informed Long Short-Term Memory for Forecasting and Reconstruction of Chaos [5.8010446129208155]
We present the Physics-Informed Long Short-Term Memory (PI-LSTM) network to reconstruct and predict the evolution of unmeasured variables in a chaotic system.
The training is constrained by a regularization term, which penalizes solutions that violate the system's governing equations.
This work opens up new opportunities for state reconstruction and learning of the dynamics of nonlinear systems.
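A hedged sketch of the physics-informed penalty: the recurrent model's predictions are regularized so that a finite-difference estimate of their time derivative matches known governing equations (here Lorenz-63, used as an illustrative stand-in for the paper's exact systems).

```python
# Hedged sketch: LSTM prediction loss plus a governing-equation residual.
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=3, hidden_size=32, batch_first=True)
head = nn.Linear(32, 3)

def lorenz_rhs(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (x[..., 1] - x[..., 0])
    dy = x[..., 0] * (rho - x[..., 2]) - x[..., 1]
    dz = x[..., 0] * x[..., 1] - beta * x[..., 2]
    return torch.stack([dx, dy, dz], dim=-1)

def pi_loss(inputs, targets, dt=0.01, lam=0.1):
    out, _ = lstm(inputs)
    pred = head(out)                              # predicted trajectory
    data = ((pred - targets) ** 2).mean()
    # Physics residual: finite-difference derivative vs. governing equations.
    deriv = (pred[:, 1:] - pred[:, :-1]) / dt
    residual = ((deriv - lorenz_rhs(pred[:, :-1])) ** 2).mean()
    return data + lam * residual

x = torch.randn(4, 50, 3)
loss = pi_loss(x, x)    # toy usage; real training uses measured trajectories
loss.backward()
```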
arXiv Detail & Related papers (2023-02-03T18:27:59Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
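One way such a penalty can be realized, sketched below under illustrative assumptions: each state dimension gets its own first-layer weights, and a group penalty over the weights attached to each candidate cause drives spurious dependencies to zero.

```python
# Hedged sketch: group-sparse input weights as a penalty on a Neural ODE's
# vector field, so nonzero groups indicate candidate causal parents.
import torch
import torch.nn as nn

d, hidden = 5, 16
W1 = nn.Parameter(torch.randn(d, hidden, d) * 0.1)  # (effect, hidden, cause)
W2 = nn.Parameter(torch.randn(d, hidden) * 0.1)

def ode_fn(x):
    # Output dim i depends on input j only through the group W1[i, :, j].
    h = torch.tanh(torch.einsum('ohd,bd->boh', W1, x))
    return torch.einsum('oh,boh->bo', W2, h)

def sparsity_penalty():
    # Group lasso: one norm per (effect, cause) pair, summed.
    return W1.norm(dim=1).sum()

x = torch.randn(8, d)
traj_loss = (ode_fn(x) ** 2).mean()                 # stand-in for a fitting loss
loss = traj_loss + 1e-2 * sparsity_penalty()
loss.backward()
```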
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
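A hedged sketch of the idea: context observations are encoded into a Gaussian latent variable that conditions the ODE dynamics, so each latent sample induces a different ODE. The encoder, mean aggregation, and dimensions are illustrative choices, not the paper's exact architecture.

```python
# Hedged sketch: a latent variable inferred from context points conditions
# the ODE vector field, giving a distribution over ODE trajectories.
import torch
import torch.nn as nn

latent_dim = 8
encoder = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2 * latent_dim))
dynamics = nn.Sequential(nn.Linear(1 + latent_dim, 32), nn.Tanh(), nn.Linear(32, 1))

def sample_latent(t_ctx, y_ctx):
    # Aggregate per-point encodings, then reparameterize a Gaussian latent.
    stats = encoder(torch.stack([t_ctx, y_ctx], dim=-1)).mean(dim=0)
    mu, log_sigma = stats[:latent_dim], stats[latent_dim:]
    return mu + log_sigma.exp() * torch.randn(latent_dim)

def rollout(y0, z, dt, n):
    ys = [y0]
    for _ in range(n):
        ys.append(ys[-1] + dt * dynamics(torch.cat([ys[-1], z])))
    return torch.stack(ys)

t_ctx, y_ctx = torch.linspace(0, 1, 5), torch.sin(torch.linspace(0, 1, 5))
z = sample_latent(t_ctx, y_ctx)                # each sample induces a new ODE
traj = rollout(torch.zeros(1), z, dt=0.1, n=10)
```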
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
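A hedged sketch of the multiscale composition: separate flow-map networks are trained at different step sizes, and a long horizon is covered with coarse jumps first, then fine steps for the remainder. The two-level setup and step sizes are illustrative assumptions.

```python
# Hedged sketch: composing learned flow maps trained at different time scales.
import torch
import torch.nn as nn

coarse = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))  # flow over DT
fine = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))    # flow over dt

def predict(x0, horizon, DT=0.8, dt=0.1):
    """Cover `horizon` with as many coarse jumps as fit, then fine steps."""
    x = x0
    n_coarse = int(horizon // DT)
    n_fine = round((horizon - n_coarse * DT) / dt)
    for _ in range(n_coarse):
        x = coarse(x)   # cheap long-range jumps
    for _ in range(n_fine):
        x = fine(x)     # accurate short-range refinement
    return x

x_T = predict(torch.randn(4, 2), horizon=2.1)
```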
arXiv Detail & Related papers (2020-08-22T07:16:53Z)
- Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series with neural stochastic differential equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and interpolation tasks for sporadic time series.
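A hedged sketch of the neural-SDE ingredient: learned drift and diffusion networks drive an Euler-Maruyama recursion that keeps evolving the hidden state across irregular gaps between observations. The architecture and step size are illustrative, not the paper's exact model.

```python
# Hedged sketch: Euler-Maruyama integration of a neural SDE hidden state,
# dh = f(h) dt + g(h) dW, across sporadic inter-observation gaps.
import torch
import torch.nn as nn

hidden = 16
drift = nn.Sequential(nn.Linear(hidden, 64), nn.Tanh(), nn.Linear(64, hidden))
diffusion = nn.Sequential(nn.Linear(hidden, 64), nn.Tanh(), nn.Linear(64, hidden))

def evolve(h, elapsed, dt=0.05):
    """Evolve the hidden state over `elapsed` time between observations."""
    steps = max(1, int(elapsed / dt))
    for _ in range(steps):
        dw = torch.randn_like(h) * dt ** 0.5        # Brownian increment
        h = h + drift(h) * dt + diffusion(h) * dw
    return h

h = torch.zeros(1, hidden)
for gap in [0.3, 1.2, 0.1]:      # sporadic, irregular inter-observation gaps
    h = evolve(h, gap)           # hidden state diffuses across each gap
```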
arXiv Detail & Related papers (2020-06-11T01:40:34Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
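A hedged sketch of a liquid time-constant cell: the hidden state follows a linear first-order ODE whose effective time constant is modulated by a learned, input-dependent gate, dx/dt = -x/tau + f(x, u) * (A - x). The explicit-Euler solver and sizes are illustrative simplifications.

```python
# Hedged sketch: linear first-order dynamics with a learned modulating gate.
import torch
import torch.nn as nn

n_in, n_hidden = 3, 8
f = nn.Sequential(nn.Linear(n_in + n_hidden, 32), nn.Tanh(),
                  nn.Linear(32, n_hidden), nn.Sigmoid())
tau = nn.Parameter(torch.ones(n_hidden))       # base time constants
A = nn.Parameter(torch.zeros(n_hidden))        # learned equilibrium targets

def ltc_step(x, u, dt=0.1):
    gate = f(torch.cat([u, x], dim=-1))        # input-dependent gating
    dxdt = -x / tau + gate * (A - x)           # bounded, stable linear dynamics
    return x + dt * dxdt                       # explicit Euler update

x = torch.zeros(1, n_hidden)
for t in range(20):
    u = torch.randn(1, n_in)                   # streaming input
    x = ltc_step(x, u)
```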
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
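A hedged sketch of time-varying weights: a small hypernetwork maps t to the layer's weights, making the vector field non-autonomous. The hypernetwork parameterization is an illustrative choice, not necessarily the paper's exact construction.

```python
# Hedged sketch: an ODE vector field whose weights are a smooth function of t.
import torch
import torch.nn as nn

d, hidden = 2, 16
hyper = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                      nn.Linear(32, hidden * d + hidden))  # emits W(t) and b(t)
readout = nn.Linear(hidden, d)

def ode_fn(t, x):
    """Non-autonomous vector field: dx/dt = readout(tanh(W(t) x + b(t)))."""
    params = hyper(t.view(1, 1)).squeeze(0)
    W = params[: hidden * d].view(hidden, d)
    b = params[hidden * d:]
    return readout(torch.tanh(x @ W.t() + b))

x = torch.randn(4, d)
t, dt = torch.tensor(0.0), 0.1
for _ in range(10):
    x = x + dt * ode_fn(t, x)   # explicit Euler; weights change with t
    t = t + dt
```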
arXiv Detail & Related papers (2020-05-05T01:41:46Z)