NeuralFMU: Presenting a workflow for integrating hybrid NeuralODEs into
real world applications
- URL: http://arxiv.org/abs/2209.03933v1
- Date: Thu, 8 Sep 2022 17:17:46 GMT
- Title: NeuralFMU: Presenting a workflow for integrating hybrid NeuralODEs into
real world applications
- Authors: Tobias Thummerer, Johannes Stoljar and Lars Mikelsons
- Abstract summary: We present an intuitive workflow to set up and use NeuralFMUs.
We exemplify this concept by deploying a NeuralFMU for a consumption simulation based on a Vehicle Longitudinal Dynamics Model.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The term NeuralODE describes the structural combination of an Artificial
Neural Network (ANN) and a numerical solver for Ordinary Differential Equations
(ODEs), where the former acts as the right-hand side of the ODE to be solved. This
concept was further extended by a black-box model in the form of a Functional
Mock-up Unit (FMU) to obtain a subclass of NeuralODEs, named NeuralFMUs. The
resulting structure features the advantages of first-principle and data-driven
modeling approaches in a single simulation model: a higher prediction
accuracy than conventional First Principle Models (FPMs), combined with a
lower training effort than purely data-driven models. We present an
intuitive workflow to set up and use NeuralFMUs, enabling the encapsulation and
reuse of existing conventional models exported from common modeling tools.
Moreover, we exemplify this concept by deploying a NeuralFMU for a consumption
simulation based on a Vehicle Longitudinal Dynamics Model (VLDM), which is a
typical use case in the automotive industry. Related challenges that are often
neglected in scientific use cases, such as real measurements (e.g. noise), an
unknown system state, or high-frequency discontinuities, are handled in this
contribution. With the aim of building a hybrid model with a higher prediction
quality than the original FPM, we briefly highlight two open-source libraries:
FMI.jl for integrating FMUs into the Julia programming environment, as well as
an extension to this library called FMIFlux.jl, which allows for the integration
of FMUs into a neural network topology to finally obtain a NeuralFMU.
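To make the structure described above concrete, here is a minimal Julia sketch of a NeuralODE with a hybrid right-hand side: a first-principles function plus an ANN correction, which is the pattern a NeuralFMU instantiates. This is not the authors' code; in the paper the first-principles part is an FMU loaded via FMI.jl and embedded with FMIFlux.jl, whereas below a hand-written oscillator stands in for the FMU and the training loop is omitted.

```julia
# Conceptual sketch (not the paper's implementation): a NeuralODE whose
# right-hand side combines a first-principles model with an ANN correction.
# In a NeuralFMU, `fpm` would be replaced by state-derivative evaluations of
# an FMU (e.g. the VLDM) accessed through FMI.jl / FMIFlux.jl.
using Flux, DifferentialEquations

# First-principles part: a damped oscillator standing in for the exported FPM/FMU.
fpm(x) = [x[2], -2.0f0 * x[1] - 0.1f0 * x[2]]

# Small ANN that learns the dynamics the FPM does not capture.
ann = Chain(Dense(2, 16, tanh), Dense(16, 2))
p, re = Flux.destructure(ann)            # flatten ANN parameters for the solver

# Hybrid right-hand side: FPM dynamics plus learned correction.
function hybrid_rhs!(dx, x, p, t)
    dx .= fpm(x) .+ re(p)(x)
end

x0    = Float32[1.0, 0.0]
tspan = (0.0f0, 10.0f0)
prob  = ODEProblem(hybrid_rhs!, x0, tspan, p)
sol   = solve(prob, Tsit5(); saveat = 0.1f0)   # forward simulation of the hybrid model
# Training would wrap `solve` in a loss over measured data and optimize `p`
# (e.g. via SciMLSensitivity and Optimization.jl); omitted here for brevity.
```

The same first-principles-plus-ANN pattern recurs in several of the related papers listed below (e.g. KNODE-MPC and MINN); what distinguishes a NeuralFMU is that the first-principles part is an encapsulated, tool-exported FMU rather than hand-written Julia code.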
Related papers
- The Convex Landscape of Neural Networks: Characterizing Global Optima
and Stationary Points via Lasso Models [75.33431791218302]
Deep Neural Network (DNN) training objectives are studied via convex programming.
In this paper, we examine the use of convex recovery models for neural networks.
We show that the stationary points of the non-convex objective can be characterized as the global optima of a subsampled convex (Lasso) program.
arXiv Detail & Related papers (2023-12-19T23:04:56Z) - MINN: Learning the dynamics of differential-algebraic equations and
application to battery modeling [3.900623554490941]
We propose a novel architecture for generating model-integrated neural networks (MINN).
MINN allows integration at the level of learning the physics-based dynamics of the system.
We apply the proposed neural network architecture to model the electrochemical dynamics of lithium-ion batteries.
arXiv Detail & Related papers (2023-04-27T09:11:40Z) - Neural Modal ODEs: Integrating Physics-based Modeling with Neural ODEs
for Modeling High Dimensional Monitored Structures [9.065343126886093]
This paper proposes a framework - termed Neural Modal ODEs - to integrate physics-based modeling with deep learning.
An autoencoder learns the mapping from the first few observational data points to the initial values of the latent variables.
The decoder of the proposed model adopts the eigenmodes derived from an eigen-analysis applied to the linearized portion of a physics-based model (a schematic sketch of this encoder/latent-ODE/modal-decoder pattern is given after the related papers list).
arXiv Detail & Related papers (2022-07-16T09:30:20Z) - Machine Learning model for gas-liquid interface reconstruction in CFD
numerical simulations [59.84561168501493]
The volume of fluid (VoF) method is widely used in multi-phase flow simulations to track and locate the interface between two immiscible fluids.
A major bottleneck of the VoF method is the interface reconstruction step due to its high computational cost and low accuracy on unstructured grids.
We propose a machine learning enhanced VoF method based on Graph Neural Networks (GNN) to accelerate the interface reconstruction on general unstructured meshes.
arXiv Detail & Related papers (2022-07-12T17:07:46Z) - Simple lessons from complex learning: what a neural network model learns
about cosmic structure formation [7.270598539996841]
We train a neural network model to predict the full phase space evolution of cosmological N-body simulations.
Our model achieves percent-level accuracy at nonlinear scales of $k \sim 1\,\mathrm{Mpc}^{-1}\,h$, representing a significant improvement over COLA.
arXiv Detail & Related papers (2022-06-09T15:41:09Z) - KNODE-MPC: A Knowledge-based Data-driven Predictive Control Framework
for Aerial Robots [5.897728689802829]
We make use of a deep learning tool, knowledge-based neural ordinary differential equations (KNODE), to augment a model obtained from first principles.
The resulting hybrid model encompasses both a nominal first-principle model and a neural network learnt from simulated or real-world experimental data.
To improve closed-loop performance, the hybrid model is integrated into a novel MPC framework, known as KNODE-MPC.
arXiv Detail & Related papers (2021-09-10T12:09:18Z) - NeuralFMU: Towards Structural Integration of FMUs into Neural Networks [0.0]
This paper presents a new open-source library called FMI.jl for integrating FMI into the Julia programming environment by providing the possibility to load, parameterize and simulate FMUs.
An extension to this library called FMIFlux.jl is introduced, that allows the integration of FMUs into a neural network topology to obtain a NeuralFMU.
arXiv Detail & Related papers (2021-09-09T15:42:01Z) - Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters compared to the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z) - Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
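For the Neural Modal ODEs entry above, the described pipeline (an encoder from the first few observations to latent initial conditions, learned latent dynamics, and a decoder built from eigenmodes of a linearized physics model) can be sketched as follows. All dimensions, the random stiffness-like matrix, and the network sizes are invented for illustration; this is a hedged schematic, not the referenced paper's implementation.

```julia
# Schematic sketch of an encoder / latent-NeuralODE / modal-decoder pipeline,
# under the assumptions stated above (not the Neural Modal ODEs reference code).
using Flux, DifferentialEquations, LinearAlgebra

n_obs, n_latent, n_phys = 8, 4, 20       # snapshots, latent modes, physical DOFs (assumed)

# Fixed modal decoder: eigenmodes of a (here random, symmetric) stiffness-like matrix,
# standing in for the eigen-analysis of the linearized physics-based model.
K   = let A = randn(Float32, n_phys, n_phys); Symmetric(A' * A) end
Phi = eigen(K).vectors[:, 1:n_latent]    # first eigenmodes used as decoder basis

encoder  = Chain(Dense(n_obs * n_phys, 32, tanh), Dense(32, n_latent))
latent_f = Chain(Dense(n_latent, 32, tanh), Dense(32, n_latent))
p, re = Flux.destructure(latent_f)

latent_rhs(z, p, t) = re(p)(z)           # learned latent dynamics (NeuralODE part)

# Map the first n_obs snapshots to a latent initial condition, integrate the latent
# ODE, then decode each latent state back to physical coordinates via the eigenmodes.
function predict(first_obs, p; tspan = (0.0f0, 1.0f0), saveat = 0.05f0)
    z0  = encoder(vec(first_obs))
    sol = solve(ODEProblem(latent_rhs, z0, tspan, p), Tsit5(); saveat = saveat)
    return [Phi * z for z in sol.u]
end

x_hat = predict(randn(Float32, n_phys, n_obs), p)   # predicted physical trajectory
```

Training such a model would compare the decoded trajectory against measured responses and optimize the encoder and latent dynamics jointly; that loop, and the physics-informed terms used in the referenced paper, are omitted here.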
This list is automatically generated from the titles and abstracts of the papers on this site.