Neural Extended Kalman Filters for Learning and Predicting Dynamics of
Structural Systems
- URL: http://arxiv.org/abs/2210.04165v2
- Date: Mon, 3 Jul 2023 09:49:18 GMT
- Title: Neural Extended Kalman Filters for Learning and Predicting Dynamics of
Structural Systems
- Authors: Wei Liu, Zhilu Lai, Kiran Bacsa, Eleni Chatzi
- Abstract summary: We propose a learnable Extended Kalman Filter (EKF) for learning the latent evolution dynamics of complex physical systems.
Neural EKF is a generalized version of the conventional EKF, where the modeling of process dynamics and sensory observations can be parameterized by neural networks.
We show that the structure imposed by the Neural EKF is beneficial to the learning process.
- Score: 5.252966797394752
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate structural response prediction forms a main driver for structural
health monitoring and control applications. This often requires the proposed
model to adequately capture the underlying dynamics of complex structural
systems. In this work, we utilize a learnable Extended Kalman Filter (EKF),
named the Neural Extended Kalman Filter (Neural EKF) throughout this paper, for
learning the latent evolution dynamics of complex physical systems. The Neural
EKF is a generalized version of the conventional EKF, where the modeling of
process dynamics and sensory observations can be parameterized by neural
networks, therefore learned by end-to-end training. The method is implemented
under the variational inference framework with the EKF conducting inference
from sensing measurements. Typically, conventional variational inference models
are parameterized by neural networks independent of the latent dynamics models.
This characteristic makes the inference and reconstruction accuracy only
weakly dependent on the dynamics models and renders the associated training inadequate. In
this work, we show that the structure imposed by the Neural EKF is beneficial
to the learning process. We demonstrate the efficacy of the framework on both
simulated and real-world structural monitoring datasets, with the results
indicating significant predictive capabilities of the proposed scheme.
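As a rough illustration of the mechanism the abstract describes — an EKF whose process model is a neural network linearized on the fly — the sketch below runs one predict/update step with a tiny randomly initialized MLP as the process model. The network architecture, its weights, the linear observation model, and the noise covariances are all illustrative assumptions, not the authors' implementation; in the paper the process and observation models would be trained end-to-end under the variational inference framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP standing in for the learned process model f_theta(x).
# Weights are random here for illustration only; the paper trains them.
W1, b1 = rng.normal(scale=0.1, size=(8, 2)), np.zeros(8)
W2, b2 = rng.normal(scale=0.1, size=(2, 8)), np.zeros(2)

H = np.array([[1.0, 0.0]])   # toy observation model: observe first state only
Q = 0.01 * np.eye(2)         # assumed process noise covariance
R = np.array([[0.1]])        # assumed measurement noise covariance

def f(x):
    """Neural process model: x_{k+1} = f_theta(x_k)."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

def h(x):
    """Observation model (linear here for simplicity)."""
    return H @ x

def jacobian(func, x, eps=1e-6):
    """Finite-difference Jacobian of func at x (autodiff in practice)."""
    fx = func(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (func(x + dx) - fx) / eps
    return J

def neural_ekf_step(x, P, y):
    # Predict: push the state through the neural dynamics and
    # propagate covariance through its local linearization.
    F = jacobian(f, x)
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update: standard EKF correction with the measurement y.
    Hx = jacobian(h, x_pred)
    S = Hx @ P_pred @ Hx.T + R
    K = P_pred @ Hx.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - h(x_pred))
    P_new = (np.eye(2) - K @ Hx) @ P_pred
    return x_new, P_new

x, P = np.zeros(2), np.eye(2)
x, P = neural_ekf_step(x, P, y=np.array([0.5]))
```

Because every operation in the step is differentiable, gradients of a prediction loss can flow back through the filter into the network weights — which is what makes the filter "learnable" in the end-to-end sense the abstract describes.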
Related papers
- Can Training Dynamics of Scale-Invariant Neural Networks Be Explained by the Thermodynamics of an Ideal Gas? [4.565724079570854]
We develop a framework to describe the stationary distributions of stochastic gradient descent (SGD) with weight decay for scale-invariant neural networks. We show that key predictions of the framework, including the behavior of stationary entropy, align closely with experimental observations.
arXiv Detail & Related papers (2025-11-10T17:10:01Z) - KPFlow: An Operator Perspective on Dynamic Collapse Under Gradient Descent Training of Recurrent Networks [9.512147747894026]
We show how a gradient flow can be decomposed into a product that involves two operators. We show how their interplay gives rise to low-dimensional latent dynamics under GD. For multi-task training, we show that the operators can be used to measure how objectives relevant to individual sub-tasks align.
arXiv Detail & Related papers (2025-07-08T20:33:15Z) - Structured Kolmogorov-Arnold Neural ODEs for Interpretable Learning and Symbolic Discovery of Nonlinear Dynamics [3.9000699798128338]
We propose a novel framework that integrates structured state-space modeling with the Kolmogorov-Arnold Network (KAN). SKANODE first employs a fully trainable KAN as a universal function approximator within a structured Neural ODE framework to perform virtual sensing. We exploit the symbolic regression capability of KAN to extract compact and interpretable expressions for the system's governing dynamics.
arXiv Detail & Related papers (2025-06-23T06:42:43Z) - Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships.
Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z) - Recurrent convolutional neural networks for non-adiabatic dynamics of quantum-classical systems [1.2972104025246092]
We present an RNN model based on convolutional neural networks for modeling the nonlinear non-adiabatic dynamics of hybrid quantum-classical systems.
Validation studies show that the trained PARC model can reproduce the space-time evolution of a one-dimensional semi-classical Holstein model.
arXiv Detail & Related papers (2024-12-09T16:23:25Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Fuzzy Recurrent Stochastic Configuration Networks for Industrial Data Analytics [3.8719670789415925]
This paper presents a novel neuro-fuzzy model, termed fuzzy recurrent stochastic configuration networks (F-RSCNs), for industrial data analytics.
The proposed F-RSCN is constructed by multiple sub-reservoirs, and each sub-reservoir is associated with a Takagi-Sugeno-Kang (TSK) fuzzy rule.
By integrating TSK fuzzy inference systems into RSCNs, F-RSCNs have strong fuzzy inference capability and can achieve sound performance for both learning and generalization.
arXiv Detail & Related papers (2024-07-06T01:40:31Z) - eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently demonstrates the ability to learn a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z) - Enhancing Dynamical System Modeling through Interpretable Machine
Learning Augmentations: A Case Study in Cathodic Electrophoretic Deposition [0.8796261172196743]
We introduce a comprehensive data-driven framework aimed at enhancing the modeling of physical systems.
As a demonstrative application, we pursue the modeling of cathodic electrophoretic deposition (EPD), commonly known as e-coating.
arXiv Detail & Related papers (2024-01-16T14:58:21Z) - ConCerNet: A Contrastive Learning Based Framework for Automated
Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Physics-guided Deep Markov Models for Learning Nonlinear Dynamical
Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed Physics-guided Deep Markov Model (PgDMM)
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z) - Cubature Kalman Filter Based Training of Hybrid Differential Equation
Recurrent Neural Network Physiological Dynamic Models [13.637931956861758]
We show how we can approximate missing ordinary differential equations with known ODEs using a neural network approximation.
Results indicate that this RBSE approach to training the NN parameters yields better outcomes (measurement/state estimation accuracy) than training the neural network with backpropagation.
arXiv Detail & Related papers (2021-10-12T15:38:13Z) - Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% less parameters compared to the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.