Meta-Learning of Neural State-Space Models Using Data From Similar Systems
- URL: http://arxiv.org/abs/2211.07768v1
- Date: Mon, 14 Nov 2022 22:03:35 GMT
- Title: Meta-Learning of Neural State-Space Models Using Data From Similar Systems
- Authors: Ankush Chakrabarty, Gordon Wichern, Christopher R. Laughman
- Abstract summary: We propose the use of model-agnostic meta-learning for constructing deep encoder network-based SSMs.
We demonstrate that meta-learning can result in more accurate neural SSM models than supervised- or transfer-learning.
- Score: 11.206109495578705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural state-space models (SSMs) provide a powerful tool for modeling
dynamical systems solely using operational data. Typically, neural SSMs are
trained using data collected from the actual system under consideration,
despite the likely existence of operational data from similar systems which
have previously been deployed in the field. In this paper, we propose the use
of model-agnostic meta-learning (MAML) for constructing deep encoder
network-based SSMs, by leveraging a combination of archived data from similar
systems (used to meta-train offline) and limited data from the actual system
(used for rapid online adaptation). We demonstrate using a numerical example
that meta-learning can result in more accurate neural SSM models than
supervised- or transfer-learning, despite few adaptation steps and limited
online data. Additionally, we show that by carefully partitioning and adapting
the encoder layers while fixing the state-transition operator, we can achieve
comparable performance to MAML while reducing online adaptation complexity.
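As a rough illustration of the recipe described in the abstract, the sketch below applies first-order MAML to a deep encoder-based neural SSM in PyTorch. The network sizes, the `sample_task` interface, the learning rates, and the use of the first-order approximation are illustrative assumptions, not the authors' implementation.

```python
# Minimal first-order MAML sketch for an encoder-based neural SSM.
# All sizes, hyperparameters and the task sampler are placeholders.
import copy
import torch
import torch.nn as nn

class NeuralSSM(nn.Module):
    def __init__(self, n_u=1, n_y=1, n_x=8, window=20):
        super().__init__()
        # Encoder: maps a window of past I/O data to the initial latent state.
        self.encoder = nn.Sequential(
            nn.Linear(window * (n_u + n_y), 64), nn.Tanh(), nn.Linear(64, n_x))
        self.f = nn.Linear(n_x + n_u, n_x)   # state transition x_{k+1} = f(x_k, u_k)
        self.g = nn.Linear(n_x, n_y)         # output map y_k = g(x_k)

    def forward(self, past_io, u_future):
        x = self.encoder(past_io)                       # (batch, n_x)
        preds = []
        for k in range(u_future.shape[1]):              # roll the latent SSM forward
            x = self.f(torch.cat([x, u_future[:, k]], dim=-1))
            preds.append(self.g(x))
        return torch.stack(preds, dim=1)                # (batch, horizon, n_y)

def maml_meta_train(model, sample_task, meta_steps=1000, tasks_per_batch=4,
                    inner_steps=3, inner_lr=1e-2, meta_lr=1e-3):
    """sample_task() should return (support, query) batches, each a tuple
    (past_io, u_future, y_future) drawn from one archived 'similar system'."""
    meta_opt = torch.optim.Adam(model.parameters(), lr=meta_lr)
    loss_fn = nn.MSELoss()
    for _ in range(meta_steps):
        meta_opt.zero_grad()
        for _ in range(tasks_per_batch):
            support, query = sample_task()
            fast = copy.deepcopy(model)                 # task-specific copy
            inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
            for _ in range(inner_steps):                # inner-loop adaptation
                past, u, y = support
                inner_opt.zero_grad()
                loss_fn(fast(past, u), y).backward()
                inner_opt.step()
            past, u, y = query                          # outer loss on adapted copy
            grads = torch.autograd.grad(loss_fn(fast(past, u), y), fast.parameters())
            # First-order approximation: accumulate the adapted copy's gradients
            # directly onto the meta-parameters.
            for p, g in zip(model.parameters(), grads):
                p.grad = g if p.grad is None else p.grad + g
        meta_opt.step()
    return model
```

The reduced-complexity variant discussed in the abstract would correspond to restricting the inner-loop update to the encoder parameters while keeping the state-transition operator `f` frozen during online adaptation.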
Related papers
- Reservoir computing for system identification and predictive control with limited data [3.1484174280822845]
We assess the ability of RNN variants to both learn the dynamics of benchmark control systems and serve as surrogate models for model predictive control (MPC).
We find that echo state networks (ESNs) have a variety of benefits over competing architectures, namely reductions in computational complexity, longer valid prediction times, and reductions in cost of the MPC objective function.
arXiv Detail & Related papers (2024-10-23T21:59:07Z)
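As a rough sketch of why echo state networks are computationally cheap for this kind of system identification, the example below keeps the recurrent reservoir fixed and random and trains only the linear readout by ridge regression. The dimensions, scalings, and synthetic data are illustrative, not the paper's setup.

```python
# Minimal echo state network sketch: fixed random reservoir, linear readout
# fitted by ridge regression. All dimensions and scalings are illustrative.
import numpy as np

def esn_fit(u, y, n_res=200, rho=0.9, ridge=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, (n_res, u.shape[1]))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / np.abs(np.linalg.eigvals(W)).max()   # set spectral radius
    X, x = np.zeros((len(u), n_res)), np.zeros(n_res)
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t] + W @ x)            # reservoir state update
        X[t] = x
    # Ridge-regression readout: y is approximated by X @ W_out.T
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y).T
    return W_in, W, W_out

# Usage on a synthetic first-order lag system.
u = np.random.randn(500, 1)
y = np.zeros((500, 1))
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + 0.2 * u[t - 1]
W_in, W, W_out = esn_fit(u, y)
```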
- Evaluating Time-Series Training Dataset through Lens of Spectrum in Deep State Space Models [16.9884076931744]
We introduce the concept of data evaluation methods used in system identification.
We propose the K-spectral metric, which is the sum of the top-K spectra of signals inside deep SSMs.
Our experiments show that the K-spectral metric correlates strongly (in absolute value) with model performance.
arXiv Detail & Related papers (2024-08-29T04:46:49Z)
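The precise definition of the K-spectral metric is given in the paper above; purely as an illustration of "the sum of the top-K spectra of signals inside deep SSMs", the snippet below sums the top-K singular values of a matrix of internal activations. Treating the spectra as singular values is an assumption made here for concreteness.

```python
# Illustrative top-K spectral score of an internal signal matrix
# (time steps x hidden units); the paper's exact definition may differ.
import numpy as np

def top_k_spectral_score(signals: np.ndarray, k: int = 5) -> float:
    s = np.linalg.svd(signals, compute_uv=False)   # spectrum, descending order
    return float(s[:k].sum())

# Example on a random stand-in for deep-SSM activations.
score = top_k_spectral_score(np.random.randn(256, 64), k=5)
```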
- Automatic AI Model Selection for Wireless Systems: Online Learning via Digital Twinning [50.332027356848094]
AI-based applications are deployed at intelligent controllers to carry out functionalities like scheduling or power control.
The mapping between context and AI model parameters is ideally done in a zero-shot fashion.
This paper introduces a general methodology for the online optimization of AI model selection (AMS) mappings.
arXiv Detail & Related papers (2024-06-22T11:17:50Z)
- MPC of Uncertain Nonlinear Systems with Meta-Learning for Fast Adaptation of Neural Predictive Models [6.031205224945912]
A neural State-Space Model (NSSM) is used to approximate the nonlinear system, where a deep encoder network learns the nonlinearity from data.
This transforms the nonlinear system into a linear system in a latent space, enabling the application of model predictive control (MPC) to determine effective control actions.
arXiv Detail & Related papers (2024-04-18T11:29:43Z)
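To make the "linear system in a latent space" idea from the entry above concrete, the sketch below builds the lifted prediction matrices of a latent linear model and solves an unconstrained tracking MPC in closed form. The matrices, horizon, and least-squares formulation are illustrative; in the NSSM setting the initial latent state would come from the deep encoder, and the paper's controller may include constraints and costs not shown here.

```python
# Sketch: latent linear model z_{k+1} = A z_k + B u_k, y_k = C z_k.
# Predictions over the horizon are linear in the input sequence, so an
# unconstrained tracking MPC reduces to a least-squares problem.
import numpy as np

def lifted_matrices(A, B, C, horizon):
    """Return Phi, Gamma with y_{1..N} = Phi z_0 + Gamma [u_0; ...; u_{N-1}]."""
    n_y, n_u = C.shape[0], B.shape[1]
    Phi = np.vstack([C @ np.linalg.matrix_power(A, k + 1) for k in range(horizon)])
    Gamma = np.zeros((horizon * n_y, horizon * n_u))
    for i in range(horizon):
        for j in range(i + 1):
            Gamma[i*n_y:(i+1)*n_y, j*n_u:(j+1)*n_u] = (
                C @ np.linalg.matrix_power(A, i - j) @ B)
    return Phi, Gamma

def unconstrained_mpc(z0, A, B, C, y_ref, reg=1e-3):
    """Least-squares input sequence that tracks y_ref over the horizon."""
    N = y_ref.shape[0]
    Phi, Gamma = lifted_matrices(A, B, C, N)
    rhs = y_ref.reshape(-1) - Phi @ z0
    u = np.linalg.solve(Gamma.T @ Gamma + reg * np.eye(Gamma.shape[1]), Gamma.T @ rhs)
    return u.reshape(N, -1)   # apply u[0], re-encode the state, repeat (receding horizon)

# Toy latent model; in practice z0 comes from the encoder applied to recent I/O data.
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
u_seq = unconstrained_mpc(np.zeros(2), A, B, C, y_ref=np.ones((10, 1)))
```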
- Efficient Model Adaptation for Continual Learning at the Edge [15.334881190102895]
Most machine learning (ML) systems assume stationary and matching data distributions during training and deployment.
Data distributions often shift over time due to changes in environmental factors, sensor characteristics, and the task of interest.
This paper presents the Encoder-Adaptor-Reconfigurator (EAR) framework for efficient continual learning under domain shifts.
arXiv Detail & Related papers (2023-08-03T23:55:17Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- Real-time Neural-MPC: Deep Learning Model Predictive Control for Quadrotors and Agile Robotic Platforms [59.03426963238452]
We present Real-time Neural MPC, a framework to efficiently integrate large, complex neural network architectures as dynamics models within a model-predictive control pipeline.
We show the feasibility of our framework on real-world problems by reducing the positional tracking error by up to 82% when compared to state-of-the-art MPC approaches without neural network dynamics.
arXiv Detail & Related papers (2022-03-15T09:38:15Z)
- SOLIS -- The MLOps journey from data acquisition to actionable insights [62.997667081978825]
Existing approaches, however, do not supply the procedures and pipelines needed for the actual deployment of machine learning capabilities in real production-grade systems.
In this paper we present a unified deployment pipeline and freedom-to-operate approach that supports all requirements while using basic cross-platform tensor framework and script language engines.
arXiv Detail & Related papers (2021-12-22T14:45:37Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
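One common pattern behind such hybrid forecasters is to let a data-driven component correct the residual error of the imperfect knowledge-based model. The sketch below fits that correction with plain ridge regression on a synthetic trajectory; the paper instead trains its machine-learning component through data assimilation from noisy partial measurements, so this is only a simplified stand-in.

```python
# Generic hybrid-forecast sketch: physics-based prediction plus a learned
# linear residual correction. Dynamics, dimensions and the knowledge model
# are illustrative placeholders.
import numpy as np

def knowledge_model(x, a=0.9):
    """Imperfect knowledge-based one-step predictor (placeholder)."""
    return a * x

def fit_residual_correction(states, ridge=1e-6):
    """Fit W so that x_{t+1} is approximated by knowledge_model(x_t) + W @ x_t."""
    X = states[:-1]                                # inputs x_t
    R = states[1:] - knowledge_model(states[:-1])  # residual left by the physics model
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ R).T

def hybrid_step(x, W):
    return knowledge_model(x) + W @ x              # physics prediction + correction

# Example on synthetic data whose true dynamics differ from the knowledge model.
rng = np.random.default_rng(0)
A_true = np.array([[0.95, 0.05, 0.0], [0.0, 0.9, 0.1], [0.0, 0.0, 0.85]])
traj = np.zeros((500, 3))
traj[0] = rng.standard_normal(3)
for t in range(499):
    traj[t + 1] = A_true @ traj[t] + 0.01 * rng.standard_normal(3)
W = fit_residual_correction(traj)
x_next = hybrid_step(traj[-1], W)
```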
- Model-Based Deep Learning [155.063817656602]
Signal processing, communications, and control have traditionally relied on classical statistical modeling techniques.
Deep neural networks (DNNs) use generic architectures which learn to operate from data, and demonstrate excellent performance.
We are interested in hybrid techniques that combine principled mathematical models with data-driven systems to benefit from the advantages of both approaches.
arXiv Detail & Related papers (2020-12-15T16:29:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.