Physics-informed Neural Networks-based Model Predictive Control for
Multi-link Manipulators
- URL: http://arxiv.org/abs/2109.10793v1
- Date: Wed, 22 Sep 2021 15:31:24 GMT
- Title: Physics-informed Neural Networks-based Model Predictive Control for
Multi-link Manipulators
- Authors: Jonas Nicodemus, Jonas Kneifl, Jörg Fehr, Benjamin Unger
- Abstract summary: We discuss nonlinear model predictive control (NMPC) for multi-body dynamics via physics-informed machine learning methods.
We present the idea of enhancing PINNs by adding control actions and initial conditions as additional network inputs.
We present our results using our PINN-based MPC to solve a tracking problem for a complex mechanical system.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We discuss nonlinear model predictive control (NMPC) for multi-body dynamics
via physics-informed machine learning methods. Physics-informed neural networks
(PINNs) are a promising tool to approximate (partial) differential equations.
PINNs are not suited for control tasks in their original form since they are
not designed to handle variable control actions or variable initial values. We
thus present the idea of enhancing PINNs by adding control actions and initial
conditions as additional network inputs. The high-dimensional input space is
subsequently reduced via a sampling strategy and a zero-hold assumption. This
strategy enables the controller design based on a PINN as an approximation of
the underlying system dynamics. The additional benefit is that the
sensitivities are easily computed via automatic differentiation, thus leading
to efficient gradient-based algorithms. Finally, we present our results using
our PINN-based MPC to solve a tracking problem for a complex mechanical system,
a multi-link manipulator.
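The sampling strategy, zero-hold rollout, and receding-horizon loop described in the abstract can be sketched in a few lines. This is a minimal, dependency-light illustration, not the authors' implementation: `pinn_predict` is a stand-in for a trained PINN (here replaced by the exact solution of a hypothetical scalar system), and the candidate-grid search replaces the gradient-based optimization via automatic differentiation that the paper uses.

```python
import numpy as np

def pinn_predict(x0, u, t):
    """Stand-in for a trained PINN: maps (initial state, held control, time)
    to a predicted state. A hypothetical scalar system x' = A*x + B*u,
    solved exactly, replaces the network here."""
    A, B = -0.5, 1.0
    return x0 * np.exp(A * t) + (B * u / -A) * (1.0 - np.exp(A * t))

def mpc_step(x0, x_ref, horizon=5, dt=0.1,
             candidates=np.linspace(-2.0, 2.0, 41)):
    """One NMPC step under a zero-hold assumption: each candidate control is
    held constant over the horizon, rolled out with the surrogate, and scored
    against the reference. (The paper optimizes with gradients obtained by
    automatic differentiation; an exhaustive grid keeps this sketch simple.)"""
    best_u, best_cost = 0.0, np.inf
    for u in candidates:
        x, cost = x0, 0.0
        for _ in range(horizon):
            x = pinn_predict(x, u, dt)   # re-feed the prediction as new state
            cost += (x - x_ref) ** 2     # quadratic tracking cost
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# Receding-horizon loop: apply only the first control, then re-plan.
x = 0.0
for _ in range(20):
    u = mpc_step(x, x_ref=1.0)
    x = pinn_predict(x, u, 0.1)
```

With this toy system the closed loop settles near the reference, illustrating why a surrogate that accepts control and initial condition as inputs is enough for controller design.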
Related papers
- Domain-decoupled Physics-informed Neural Networks with Closed-form Gradients for Fast Model Learning of Dynamical Systems [2.8730926763860687]
Physics-informed neural networks (PINNs) are trained using physical equations and can incorporate unmodeled effects by learning from data.
We introduce the domain-decoupled physics-informed neural network (DD-PINN) to address current limitations of PINC in handling large and complex nonlinear dynamical systems.
arXiv Detail & Related papers (2024-08-27T10:54:51Z)
- Dropout MPC: An Ensemble Neural MPC Approach for Systems with Learned Dynamics [0.0]
We propose a novel sampling-based ensemble neural MPC algorithm that employs the Monte-Carlo dropout technique on the learned system model.
The method broadly targets uncertain systems with complex dynamics, where models derived from first principles are hard to obtain.
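The Monte-Carlo dropout idea behind this ensemble approach can be illustrated briefly: keeping dropout active at inference turns one learned model into a sampled ensemble whose spread measures model uncertainty. Everything below is a hypothetical stand-in (random "pretend-trained" weights, a one-step scalar model), not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-step dynamics model: a tiny MLP whose fixed weights stand
# in for a trained network. Dropout stays ACTIVE at inference time.
W1 = rng.normal(size=(8, 2))
W2 = rng.normal(size=(1, 8)) / 8.0

def model_step(x, u, keep_prob=0.8):
    h = np.tanh(W1 @ np.array([x, u]))
    mask = rng.random(8) < keep_prob   # fresh Monte-Carlo dropout mask
    h = h * mask / keep_prob           # inverted-dropout rescaling
    return (W2 @ h).item()

def ensemble_predict(x, u, n_samples=100):
    """Each dropout mask acts as one ensemble member; the mean is the
    prediction and the standard deviation is an uncertainty estimate an
    MPC cost can penalize."""
    preds = np.array([model_step(x, u) for _ in range(n_samples)])
    return preds.mean(), preds.std()

mean, std = ensemble_predict(x=0.5, u=0.1)
```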
arXiv Detail & Related papers (2024-06-04T17:15:25Z)
- Parameter-Adaptive Approximate MPC: Tuning Neural-Network Controllers without Retraining [50.00291020618743]
This work introduces a novel, parameter-adaptive AMPC architecture capable of online tuning without recomputing large datasets and retraining.
We showcase the effectiveness of parameter-adaptive AMPC by controlling the swing-ups of two different real cartpole systems with a severely resource-constrained microcontroller (MCU).
Taken together, these contributions represent a marked step toward the practical application of AMPC in real-world systems.
arXiv Detail & Related papers (2024-04-08T20:02:19Z)
- Model-Based Control with Sparse Neural Dynamics [23.961218902837807]
We propose a new framework for integrated model learning and predictive control.
We show that our framework can deliver better closed-loop performance than existing state-of-the-art methods.
arXiv Detail & Related papers (2023-12-20T06:25:02Z)
- GPT-PINN: Generative Pre-Trained Physics-Informed Neural Networks toward non-intrusive Meta-learning of parametric PDEs [0.0]
We propose the Generative Pre-Trained PINN (GPT-PINN) to mitigate both challenges in the setting of parametric PDEs.
As a network of networks, its outer meta-network is hyper-reduced, with only one hidden layer and a significantly reduced number of neurons.
The meta-network adaptively "learns" the parametric dependence of the system and "grows" this hidden layer one neuron at a time.
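The "grow one neuron at a time" strategy is essentially a greedy reduced-basis loop: each "neuron" of the meta-network is a full pre-trained PINN for one parameter value, and the next neuron is chosen where the current approximation is worst. A stand-in sketch, in which a hypothetical analytic function family replaces the pre-trained PINNs:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)       # spatial grid
mus = np.linspace(1.0, 5.0, 20)      # candidate parameter values

def snapshot(mu):
    """Stand-in for a full PINN pre-trained at parameter mu."""
    return np.sin(mu * np.pi * x)

def greedy_grow(n_neurons=3):
    basis, chosen = [], []
    for _ in range(n_neurons):
        errs = []
        for mu in mus:
            residual = snapshot(mu)
            if basis:                 # project out the current basis
                B = np.stack(basis, axis=1)
                coef, *_ = np.linalg.lstsq(B, residual, rcond=None)
                residual = residual - B @ coef
            errs.append(np.linalg.norm(residual))
        mu_star = mus[int(np.argmax(errs))]   # worst-approximated parameter
        chosen.append(float(mu_star))
        basis.append(snapshot(mu_star))       # grow the layer by one neuron
    return chosen

selected = greedy_grow()
```

Because each added neuron reproduces its own parameter exactly, the greedy loop always picks a new, distinct parameter value.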
arXiv Detail & Related papers (2023-03-27T02:22:09Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Signal Detection in MIMO Systems with Hardware Imperfections: Message Passing on Neural Networks [101.59367762974371]
In this paper, we investigate signal detection in multiple-input-multiple-output (MIMO) communication systems with hardware impairments.
It is difficult to train a deep neural network (DNN) with limited pilot signals, hindering its practical applications.
We design an efficient message passing based Bayesian signal detector, leveraging the unitary approximate message passing (UAMP) algorithm.
arXiv Detail & Related papers (2022-10-08T04:32:58Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- Physics-Informed Neural Nets-based Control [5.252190504926357]
This work presents a new framework called Physics-Informed Neural Nets-based Control (PINC).
PINC is amenable to control problems and able to simulate over longer time horizons that are not fixed beforehand.
We showcase our method in the control of two nonlinear dynamic systems.
arXiv Detail & Related papers (2021-04-06T14:55:23Z)
- Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
arXiv Detail & Related papers (2020-06-02T23:38:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.