Deep State Space Models for Nonlinear System Identification
- URL: http://arxiv.org/abs/2003.14162v3
- Date: Fri, 18 Jun 2021 12:34:04 GMT
- Title: Deep State Space Models for Nonlinear System Identification
- Authors: Daniel Gedon, Niklas Wahlström, Thomas B. Schön, Lennart Ljung
- Abstract summary: Deep state space models (SSMs) are an actively researched model class for temporal models developed in the deep learning community.
The use of deep SSMs as a black-box identification model can describe a wide range of dynamics due to the flexibility of deep neural networks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep state space models (SSMs) are an actively researched model class for
temporal models developed in the deep learning community which have a close
connection to classic SSMs. The use of deep SSMs as a black-box identification
model can describe a wide range of dynamics due to the flexibility of deep
neural networks. Additionally, the probabilistic nature of the model class
allows the uncertainty of the system to be modelled. In this work, a deep SSM
class and its parameter learning algorithm are explained in an effort to extend
the toolbox of nonlinear identification methods with a deep learning based
method. Six recent deep SSMs are evaluated in a first unified implementation on
nonlinear system identification benchmarks.
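As a rough illustration of the model class discussed in the abstract, a deep SSM keeps the classic state-space structure but replaces the transition and emission maps with neural networks, with Gaussian noise giving the model its probabilistic character. The sketch below is a minimal, untrained NumPy example; all weights, dimensions, and noise levels are illustrative assumptions, not the architecture from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # tiny two-layer network used as a generic nonlinear function approximator
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Randomly initialised (untrained) weights -- stand-ins for learned parameters.
# dz: latent state dim, du: input dim, dy: output dim, dh: hidden width.
dz, du, dy, dh = 2, 1, 1, 8
Wf1 = rng.normal(scale=0.5, size=(dz + du, dh)); bf1 = np.zeros(dh)
Wf2 = rng.normal(scale=0.5, size=(dh, dz));      bf2 = np.zeros(dz)
Wg1 = rng.normal(scale=0.5, size=(dz, dh));      bg1 = np.zeros(dh)
Wg2 = rng.normal(scale=0.5, size=(dh, dy));      bg2 = np.zeros(dy)

def simulate(u_seq, process_std=0.05, obs_std=0.05):
    """Roll out the generative model
    z_{t+1} ~ N(f(z_t, u_t), process_std^2), y_t ~ N(g(z_t), obs_std^2)."""
    z = np.zeros(dz)
    ys = []
    for u in u_seq:
        # emission: observe the current latent state through network g
        y = mlp(z, Wg1, bg1, Wg2, bg2) + obs_std * rng.normal(size=dy)
        ys.append(y)
        # transition: propagate the latent state with network f
        zu = np.concatenate([z, u])
        z = mlp(zu, Wf1, bf1, Wf2, bf2) + process_std * rng.normal(size=dz)
    return np.array(ys)
```

Sampling a trajectory for a constant input, e.g. `simulate(np.ones((20, 1)))`, yields a 20-step output sequence; parameter learning (the focus of the paper) would fit the network weights to observed input-output data, typically via a variational objective.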
Related papers
- Recursive Learning of Asymptotic Variational Objectives [49.69399307452126]
General state-space models (SSMs) are widely used in statistical machine learning and are among the most classical generative models for sequential time-series data.
Online sequential IWAE (OSIWAE) allows for online learning of both model parameters and a Markovian recognition model for inferring latent states.
This approach is more theoretically well-founded than recently proposed online variational SMC methods.
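For context, the importance-weighted (IWAE) bound that sequential IWAE-style objectives build on can be written in its standard form (this is the textbook formulation, not the paper's specific online objective):

```latex
\mathcal{L}_K(\theta, \phi)
  = \mathbb{E}_{z_{1:K} \sim q_\phi(\cdot \mid x)}
    \left[ \log \frac{1}{K} \sum_{k=1}^{K}
      \frac{p_\theta(x, z_k)}{q_\phi(z_k \mid x)} \right]
  \le \log p_\theta(x)
```

The bound tightens as the number of importance samples K grows; the online sequential variant adapts this idea to streaming time-series data.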
arXiv Detail & Related papers (2024-11-04T16:12:37Z) - Deep Learning of Dynamic Systems using System Identification Toolbox(TM) [1.7537651436360189]
The toolbox offers neural state-space models which can be extended with auto-encoding features.
The toolbox contains several other enhancements that deepen its integration with state-of-the-art machine learning techniques.
arXiv Detail & Related papers (2024-09-11T21:54:17Z) - Hybrid Recurrent Models Support Emergent Descriptions for Hierarchical Planning and Control [0.8749675983608172]
A class of hybrid state-space model known as recurrent switching linear dynamical systems (rSLDS) discovers meaningful behavioural units.
We propose that the rich representations formed by an rSLDS can provide useful abstractions for planning and control.
We present a novel hierarchical model-based algorithm inspired by Active Inference in which a discrete MDP sits above a low-level linear-quadratic controller.
arXiv Detail & Related papers (2024-08-20T16:02:54Z) - Towards a theory of learning dynamics in deep state space models [12.262490032020832]
State space models (SSMs) have shown remarkable empirical performance on many long sequence modeling tasks.
This work is a step toward a theory of learning dynamics in deep state space models.
arXiv Detail & Related papers (2024-07-10T00:01:56Z) - Mamba-FSCIL: Dynamic Adaptation with Selective State Space Model for Few-Shot Class-Incremental Learning [113.89327264634984]
Few-shot class-incremental learning (FSCIL) confronts the challenge of integrating new classes into a model with minimal training samples.
Traditional methods widely adopt static adaptation relying on a fixed parameter space to learn from data that arrive sequentially.
We propose a dual selective SSM projector that dynamically adjusts the projection parameters based on the intermediate features for dynamic adaptation.
arXiv Detail & Related papers (2024-07-08T17:09:39Z) - Semi-Supervised Learning of Dynamical Systems with Neural Ordinary Differential Equations: A Teacher-Student Model Approach [10.20098335268973]
TS-NODE is the first semi-supervised approach to modeling dynamical systems with NODE.
We show significant performance improvements over a baseline Neural ODE model on multiple dynamical system modeling tasks.
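For readers unfamiliar with the NODE building block: the core idea is to parameterise the time derivative of the state with a network and integrate it forward in time. A minimal fixed-step Euler sketch follows; the vector field `f_theta` and its parameters are hypothetical illustrations, not TS-NODE's actual model:

```python
import numpy as np

def f_theta(x, t, theta):
    # hypothetical parameterised vector field dx/dt = f_theta(x, t)
    A, b = theta
    return np.tanh(A @ x + b)

def odeint_euler(f, x0, t_grid, theta):
    """Fixed-step forward-Euler integration of dx/dt = f(x, t, theta)."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        x = x + (t1 - t0) * f(x, t0, theta)
        traj.append(x.copy())
    return np.array(traj)
```

In practice a Neural ODE would use an adaptive solver and differentiate through the integration (e.g. via the adjoint method) to train `theta`; Euler is used here only to keep the sketch self-contained.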
arXiv Detail & Related papers (2023-10-19T19:17:12Z) - ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z) - Learning Stable Deep Dynamics Models [91.90131512825504]
We propose an approach for learning dynamical systems that are guaranteed to be stable over the entire state space.
We show that such learning systems are able to model simple dynamical systems and can be combined with additional deep generative models to learn complex dynamics.
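One common way to obtain such a guarantee, sketched here under simplifying assumptions (a fixed quadratic Lyapunov function and a hand-written nominal vector field, not the paper's learned construction), is to project the nominal dynamics so that the Lyapunov function provably decreases along every trajectory:

```python
import numpy as np

def nominal_dynamics(x):
    # stand-in for a learned, possibly unstable vector field
    return np.tanh(x) + 0.5 * x

def lyapunov(x):
    # fixed quadratic Lyapunov candidate V(x) = ||x||^2
    return float(x @ x)

def grad_lyapunov(x):
    return 2.0 * x

def stable_dynamics(x, alpha=0.1):
    """Project the nominal dynamics so that dV/dt <= -alpha * V(x).

    Whenever the nominal field would increase V too fast, subtract just
    enough of the gradient direction to restore the decrease condition.
    """
    f = nominal_dynamics(x)
    g = grad_lyapunov(x)
    violation = g @ f + alpha * lyapunov(x)
    if violation > 0.0:
        f = f - violation * g / (g @ g + 1e-12)
    return f
```

Because the projection enforces the decrease condition pointwise, stability holds over the entire state space regardless of how badly the nominal (learned) field behaves, which is the kind of guarantee the paper is after.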
arXiv Detail & Related papers (2020-01-17T00:04:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.