Modeling Nonlinear Dynamics in Continuous Time with Inductive Biases on
Decay Rates and/or Frequencies
- URL: http://arxiv.org/abs/2212.13033v1
- Date: Mon, 26 Dec 2022 08:08:43 GMT
- Title: Modeling Nonlinear Dynamics in Continuous Time with Inductive Biases on
Decay Rates and/or Frequencies
- Authors: Tomoharu Iwata, Yoshinobu Kawahara
- Abstract summary: We propose a neural network-based model for nonlinear dynamics in continuous time that can impose inductive biases on decay rates and frequencies.
We use neural networks to find an appropriate Koopman space; the networks are trained by minimizing multi-step forecasting and backcasting errors on irregularly sampled time-series data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a neural network-based model for nonlinear dynamics in continuous
time that can impose inductive biases on decay rates and/or frequencies.
Inductive biases are helpful for training neural networks, especially when
training data are scarce. The proposed model is based on the Koopman operator
theory, where the decay rate and frequency information is used by restricting
the eigenvalues of the Koopman operator that describe linear evolution in a
Koopman space. We use neural networks to find an appropriate Koopman space;
the networks are trained by minimizing multi-step forecasting and backcasting
errors on irregularly sampled time-series data. Experiments on various
time-series datasets demonstrate that the proposed method achieves higher
forecasting performance than existing methods when given only a single short
training sequence.
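The eigenvalue restriction described in the abstract can be made concrete with a small sketch. In continuous time, each Koopman eigenvalue can be written as λ_j = −γ_j + iω_j, where γ_j ≥ 0 is a decay rate and ω_j is an angular frequency; fixing or bounding γ_j and ω_j is the inductive bias, and the latent state then evolves linearly at arbitrary (irregular) time stamps. The function below is an illustrative stand-in, not the authors' code; `evolve_koopman_modes` and its arguments are hypothetical names, and the encoder that would produce `z0` is omitted.

```python
import numpy as np

def evolve_koopman_modes(z0, gamma, omega, times):
    """Evolve complex Koopman coordinates z0 to each time in `times`.

    z0:    (d,) complex initial coordinates (in practice, the output of
           an encoder network mapping observations into the Koopman space)
    gamma: (d,) nonnegative decay rates (the inductive bias)
    omega: (d,) angular frequencies (the inductive bias)
    times: (T,) possibly irregular time stamps
    """
    # Restricted continuous-time eigenvalues: lambda_j = -gamma_j + i*omega_j
    lam = -np.asarray(gamma) + 1j * np.asarray(omega)
    times = np.asarray(times).reshape(-1, 1)
    # Linear evolution in the Koopman space: z(t) = exp(lambda * t) * z(0)
    return np.asarray(z0) * np.exp(lam * times)   # shape (T, d)

# One mode with decay rate 0.5 and frequency 2*pi, sampled irregularly
traj = evolve_koopman_modes(z0=[1.0 + 0j], gamma=[0.5], omega=[2 * np.pi],
                            times=[0.0, 0.3, 1.0])
```

Because γ_j ≥ 0, every mode is non-expanding, so forecasts stay bounded regardless of the horizon; a frequency bias is imposed analogously by constraining ω_j.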
Related papers
- Koopman Neural Forecaster for Time Series with Temporal Distribution
Shifts [26.95428146824254]
We propose a novel deep sequence model based on the Koopman theory for time series forecasting.
Koopman Neural Forecaster (KNF) learns the linear Koopman space and the coefficients of chosen measurement functions.
We demonstrate that KNF achieves superior performance compared to alternatives on multiple time-series datasets.
arXiv Detail & Related papers (2022-10-07T16:33:50Z) - A Quadrature Perspective on Frequency Bias in Neural Network Training
with Nonuniform Data [1.7188280334580197]
Gradient-based algorithms minimize the low-frequency misfit before reducing the high-frequency residuals.
We use the Neural Tangent Kernel (NTK) to provide a theoretically rigorous analysis for training where data are drawn from constant or piecewise-constant probability densities.
arXiv Detail & Related papers (2022-05-28T02:31:19Z) - DeepBayes -- an estimator for parameter estimation in stochastic
nonlinear dynamical models [11.917949887615567]
We propose DeepBayes estimators that leverage the power of deep recurrent neural networks in learning an estimator.
The deep recurrent neural network architectures can be trained offline and ensure significant time savings during inference.
We demonstrate the applicability of our proposed method on different example models and perform detailed comparisons with state-of-the-art approaches.
arXiv Detail & Related papers (2022-05-04T18:12:17Z) - Benign Overfitting without Linearity: Neural Network Classifiers Trained
by Gradient Descent for Noisy Linear Data [44.431266188350655]
We consider the generalization error of two-layer neural networks trained by gradient descent.
We show that neural networks exhibit benign overfitting: they can be driven to zero training error, perfectly fitting noisy training labels, while simultaneously achieving minimax-optimal test error.
In contrast to previous work on benign overfitting that requires linear or kernel-based predictors, our analysis holds in a setting where both the model and the learning dynamics are fundamentally nonlinear.
arXiv Detail & Related papers (2022-02-11T23:04:00Z) - Meta-Learning for Koopman Spectral Analysis with Short Time-series [49.41640137945938]
Existing methods require long time-series for training neural networks.
We propose a meta-learning method for estimating embedding functions from unseen short time-series.
We experimentally demonstrate that the proposed method achieves better performance in terms of eigenvalue estimation and future prediction.
arXiv Detail & Related papers (2021-02-09T07:19:19Z) - Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear
Dynamics [49.41640137945938]
We propose a neural dynamic mode decomposition for estimating a lift function based on neural networks.
With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition.
Our experiments demonstrate the effectiveness of our proposed method in terms of eigenvalue estimation and forecast performance.
arXiv Detail & Related papers (2020-12-11T08:34:26Z) - A Bayesian Perspective on Training Speed and Model Selection [51.15664724311443]
We show that a measure of a model's training speed can be used to estimate its marginal likelihood.
We verify our results in model selection tasks for linear models and for the infinite-width limit of deep neural networks.
Our results suggest a promising new direction towards explaining why neural networks trained with gradient descent are biased towards functions that generalize well.
arXiv Detail & Related papers (2020-10-27T17:56:14Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
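Several of the papers listed above (Neural Dynamic Mode Decomposition, Consistent Koopman Autoencoders) build on dynamic mode decomposition, whose linear core is the spectral decomposition that the neural variants backpropagate through. The sketch below shows standard exact DMD on snapshot pairs; `dmd` and the damped-rotation example are illustrative, not code from any of the papers.

```python
import numpy as np

def dmd(X, Y):
    """Fit the least-squares linear operator A with Y ~= A X and return
    its spectral decomposition (DMD eigenvalues and modes).

    X, Y: (d, T) matrices of consecutive (lifted) states, column-paired.
    """
    A = Y @ np.linalg.pinv(X)            # least-squares one-step operator
    eigvals, modes = np.linalg.eig(A)    # spectral decomposition
    return A, eigvals, modes

# Snapshot pairs generated by a known damped rotation: |eigenvalues| = 0.9
theta = 0.4
A_true = 0.9 * np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 50))
A_est, eigvals, modes = dmd(X, A_true @ X)
```

In the neural setting, the lifting that produces `X` and `Y` is learned, and the forecast error is backpropagated through both the network and this decomposition.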
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.