Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems
- URL: http://arxiv.org/abs/2207.02542v1
- Date: Wed, 6 Jul 2022 09:43:03 GMT
- Title: Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems
- Authors: Manuel Brenner, Florian Hess, Jonas M. Mikhaeil, Leonard Bereska,
Zahra Monfared, Po-Chen Kuo, Daniel Durstewitz
- Abstract summary: We augment a piecewise-linear recurrent neural network (RNN) by a linear spline basis expansion.
We show that this approach retains all the theoretically appealing properties of the simple PLRNN, yet boosts its capacity for approximating arbitrary nonlinear dynamical systems in comparatively low dimensions.
- Score: 7.045072177165241
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many scientific disciplines, we are interested in inferring the nonlinear
dynamical system underlying a set of observed time series, a challenging task
in the face of chaotic behavior and noise. Previous deep learning approaches
toward this goal often suffered from a lack of interpretability and
tractability. In particular, the high-dimensional latent spaces often required
for a faithful embedding, even when the underlying dynamics lives on a
lower-dimensional manifold, can hamper theoretical analysis. Motivated by the
emerging principles of dendritic computation, we augment a dynamically
interpretable and mathematically tractable piecewise-linear (PL) recurrent
neural network (RNN) by a linear spline basis expansion. We show that this
approach retains all the theoretically appealing properties of the simple
PLRNN, yet boosts its capacity for approximating arbitrary nonlinear dynamical
systems in comparatively low dimensions. We employ two frameworks for training
the system, one combining back-propagation-through-time (BPTT) with teacher
forcing, and another based on fast and scalable variational inference. We show
that the dendritically expanded PLRNN achieves better reconstructions with
fewer parameters and dimensions on various dynamical systems benchmarks and
compares favorably to other methods, while retaining a tractable and
interpretable structure.
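The core architectural idea, replacing the single ReLU nonlinearity of a standard PLRNN with a linear spline basis expansion, can be sketched as follows. This is an illustrative reimplementation based only on the abstract, not the authors' code; the basis form phi(z) = sum_b alpha_b * max(0, z - theta_b) and all parameter names are assumptions.

```python
import numpy as np

def dend_plrnn_step(z, A, W, h0, alpha, thetas):
    """One latent-state update of a dendritically expanded PLRNN (sketch).

    Standard PLRNN:    z_{t+1} = A z_t + W max(0, z_t) + h0
    Dendritic variant: the ReLU is replaced by a linear spline basis
    expansion, phi(z) = sum_b alpha_b * max(0, z - theta_b), giving each
    unit a piecewise-linear "dendritic" input-output curve with B slopes
    instead of a single kink.
    """
    # alpha, thetas have shape (B,); the basis is shared across units here
    phi = sum(a * np.maximum(0.0, z - t) for a, t in zip(alpha, thetas))
    return A @ z + W @ phi + h0

# Tiny usage example with random parameters (M latent units, B basis functions)
rng = np.random.default_rng(0)
M, B = 3, 4
A = np.diag(rng.uniform(0.5, 0.9, M))    # diagonal self-connections
W = 0.1 * rng.standard_normal((M, M))    # coupling through the nonlinearity
h0 = rng.standard_normal(M)
alpha = rng.standard_normal(B)           # slopes of the spline basis
thetas = rng.standard_normal(B)          # thresholds of the spline basis
z = rng.standard_normal(M)
z_next = dend_plrnn_step(z, A, W, h0, alpha, thetas)
print(z_next.shape)  # (3,)
```

Note that with B = 1, alpha = [1], and theta = [0] the update reduces exactly to the plain PLRNN, which is why the expanded model retains the same piecewise-linear tractability.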
Related papers
- Almost-Linear RNNs Yield Highly Interpretable Symbolic Codes in Dynamical Systems Reconstruction [8.473495734873872]
We introduce Almost-Linear Recurrent Neural Networks (AL-RNNs) which automatically and robustly produce parsimonious PWL representations of Dynamical Systems (DS) from time series data.
AL-RNNs can be efficiently trained with any SOTA algorithm for dynamical systems reconstruction (DSR).
We show that for the Lorenz and Rössler systems, AL-RNNs discover, in a purely data-driven way, the known topologically minimal PWL representations of the corresponding chaotic attractors.
arXiv Detail & Related papers (2024-10-18T07:44:12Z)
- Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS).
Our method builds on previous work modeling the latent state evolution via a differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs).
Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z)
- Generalized Teacher Forcing for Learning Chaotic Dynamics [9.841893058953625]
Chaotic dynamical systems (DS) are ubiquitous in nature and society. Often we are interested in reconstructing such systems from observed time series for prediction or mechanistic insight.
We show on several DS that with these amendments we can reconstruct DS better than current SOTA algorithms, in much lower dimensions.
This work thus yields a simple yet powerful DS reconstruction algorithm that is at the same time highly interpretable.
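The idea behind generalized teacher forcing can be sketched in a few lines: rather than feeding the RNN either its own generated latent state (free-running) or the data-inferred state (full teacher forcing), the two are linearly interpolated at every step. This is an illustrative sketch of that interpolation, not the paper's implementation; the function names and the simple Euler-style rollout are assumptions.

```python
import numpy as np

def generalized_teacher_forcing(z_model, z_data, alpha):
    """Blend the freely generated latent state with a data-inferred one
    before the next RNN step (sketch of the generalized teacher forcing idea).

    alpha = 0 recovers free-running generation; alpha = 1 recovers full
    teacher forcing. Intermediate values trade gradient stability during
    BPTT against exposure to the model's own trajectory.
    """
    return (1.0 - alpha) * z_model + alpha * z_data

def rollout(step_fn, z0, z_data_seq, alpha):
    """Unroll an RNN step function with the blending applied at each step."""
    zs, z = [], z0
    for z_data in z_data_seq:
        z = step_fn(generalized_teacher_forcing(z, z_data, alpha))
        zs.append(z)
    return np.stack(zs)

# Usage: a toy contracting "RNN" step and a short data-inferred sequence
step = lambda z: 0.5 * z
data = np.arange(6.0).reshape(3, 2)
traj = rollout(step, np.zeros(2), data, alpha=0.5)
```

With chaotic systems, some amount of forcing (alpha > 0) is what keeps gradients from exploding over long unrolled sequences, which is the motivation both here and in the main paper's BPTT-with-teacher-forcing training framework.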
arXiv Detail & Related papers (2023-06-07T13:04:34Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- Expressive architectures enhance interpretability of dynamics-based neural population models [2.294014185517203]
We evaluate the performance of sequential autoencoders (SAEs) in recovering latent chaotic attractors from simulated neural datasets.
We found that SAEs with widely-used recurrent neural network (RNN)-based dynamics were unable to infer accurate firing rates at the true latent state dimensionality.
arXiv Detail & Related papers (2022-12-07T16:44:26Z)
- Learning Low Dimensional State Spaces with Overparameterized Recurrent Neural Nets [57.06026574261203]
We provide theoretical evidence for learning low-dimensional state spaces, which can also model long-term memory.
Experiments corroborate our theory, demonstrating extrapolation via learning low-dimensional state spaces with both linear and non-linear RNNs.
arXiv Detail & Related papers (2022-10-25T14:45:15Z)
- Decomposed Linear Dynamical Systems (dLDS) for learning the latent components of neural dynamics [6.829711787905569]
We propose a new decomposed dynamical system model that represents complex non-stationary and nonlinear dynamics of time series data.
Our model is trained through a dictionary learning procedure, where we leverage recent results in tracking sparse vectors over time.
In both continuous-time and discrete-time instructional examples we demonstrate that our model can well approximate the original system.
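The decomposition described above, expressing complex dynamics as a time-varying sparse combination of a dictionary of linear operators, can be sketched as a single integration step. This is an assumed formulation (dx/dt = (sum_i c_i(t) A_i) x) inferred from the summary; in the actual method both the coefficients and the dictionary are learned, here they are simply given.

```python
import numpy as np

def dlds_step(x, coeffs, dictionary, dt=0.01):
    """One Euler step of a decomposed linear dynamical system (sketch).

    The instantaneous dynamics matrix is a sparse, time-varying linear
    combination of a dictionary of operators {A_i}:
        dx/dt = (sum_i c_i(t) A_i) x
    coeffs has shape (K,), dictionary has shape (K, n, n).
    """
    A_t = np.tensordot(coeffs, dictionary, axes=1)  # contract over K -> (n, n)
    return x + dt * (A_t @ x)

# Usage: a two-operator dictionary (identity growth + pure rotation)
x = np.array([1.0, 2.0])
D = np.stack([np.eye(2), np.array([[0.0, -1.0], [1.0, 0.0]])])
x_next = dlds_step(x, np.array([0.5, 0.5]), D, dt=0.1)
```

Switching-LDS models pick exactly one A_i per time step; letting the coefficients vary continuously (but sparsely) is what allows the decomposed model to represent non-stationary dynamics smoothly.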
arXiv Detail & Related papers (2022-06-07T02:25:38Z)
- Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns these basis functions using a supervised learning approach.
arXiv Detail & Related papers (2021-09-06T04:39:06Z)
- Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear Dynamics [49.41640137945938]
We propose a neural dynamic mode decomposition for estimating a lift function based on neural networks.
With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition.
Our experiments demonstrate the effectiveness of our proposed method in terms of eigenvalue estimation and forecast performance.
arXiv Detail & Related papers (2020-12-11T08:34:26Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences of their use.