Thermal Neural Networks: Lumped-Parameter Thermal Modeling With
State-Space Machine Learning
- URL: http://arxiv.org/abs/2103.16323v1
- Date: Tue, 30 Mar 2021 13:15:48 GMT
- Title: Thermal Neural Networks: Lumped-Parameter Thermal Modeling With
State-Space Machine Learning
- Authors: Wilhelm Kirchgässner, Oliver Wallscheid, Joachim Böcker
- Abstract summary: Thermal models for electric power systems are required to be both real-time capable and highly accurate.
In this work, the thermal neural network (TNN) is introduced, which unifies consolidated knowledge in the form of heat-transfer-based lumped-parameter models with data-driven nonlinear function approximation.
A TNN has physically interpretable states through its state-space representation, is end-to-end trainable, and requires no material, geometry, or expert knowledge for its design.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With electric power systems becoming more compact and increasingly powerful,
the relevance of thermal stress, especially during overload operation, is
expected to keep increasing. Whenever critical temperatures cannot be measured
economically with sensors, a thermal model lends itself to estimating those
unknown quantities. Thermal models for electric power systems are usually
required to be both real-time capable and highly accurate. Moreover, ease of
implementation and time to production play an increasingly important role. In
this work, the thermal neural network (TNN) is introduced, which unifies
consolidated knowledge in the form of heat-transfer-based lumped-parameter
models with data-driven nonlinear function approximation through supervised
machine learning. A quasi-linear parameter-varying system is identified solely
from empirical data, where the relationships between scheduling variables and
system matrices are inferred statistically and automatically. At the same time,
a TNN has physically interpretable states through its state-space
representation, is end-to-end trainable with automatic differentiation, similar
to deep learning models, and requires no material, geometry, or expert
knowledge for its design. Experiments on an electric motor data set show that a
TNN achieves higher temperature estimation accuracy than previous white-,
grey-, and black-box models, with a mean squared error of $3.18~\text{K}^2$ and
a worst-case error of $5.84~\text{K}$ using only 64 model parameters.
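To make the idea concrete, the following is a minimal, hypothetical sketch in PyTorch (not the authors' implementation): small networks map the operating point to the thermal conductances and power losses of a two-node lumped-parameter thermal network, whose governing equation $C_i \, \mathrm{d}\vartheta_i/\mathrm{d}t = P_i + \sum_j g_{ij}(\vartheta_j - \vartheta_i)$ is integrated with an explicit Euler step and trained end-to-end on measured temperatures. All layer sizes, input choices, and the sample time are assumptions for illustration.

import torch
import torch.nn as nn

class TinyThermalNN(nn.Module):
    """Two-node lumped-parameter thermal network with NN-predicted coefficients (illustrative)."""

    def __init__(self, n_inputs=4, hidden=16, dt=0.5):
        super().__init__()
        self.dt = dt  # sample time in seconds (assumed)
        # thermal conductances: node1-node2, node1-ambient, node2-ambient (Softplus keeps them positive)
        self.cond_net = nn.Sequential(nn.Linear(n_inputs + 2, hidden), nn.Tanh(),
                                      nn.Linear(hidden, 3), nn.Softplus())
        # power losses injected into the two nodes
        self.loss_net = nn.Sequential(nn.Linear(n_inputs, hidden), nn.Tanh(),
                                      nn.Linear(hidden, 2), nn.Softplus())
        # log of inverse thermal capacitances, learned as free parameters
        self.log_inv_c = nn.Parameter(torch.zeros(2))

    def step(self, theta, u, theta_amb):
        """One explicit Euler step of C * dtheta/dt = losses + conductive heat flows."""
        g = self.cond_net(torch.cat([u, theta], dim=-1))   # (batch, 3)
        p = self.loss_net(u)                               # (batch, 2)
        g12, g1a, g2a = g[:, 0], g[:, 1], g[:, 2]
        d1 = p[:, 0] + g12 * (theta[:, 1] - theta[:, 0]) + g1a * (theta_amb - theta[:, 0])
        d2 = p[:, 1] + g12 * (theta[:, 0] - theta[:, 1]) + g2a * (theta_amb - theta[:, 1])
        return theta + self.dt * torch.exp(self.log_inv_c) * torch.stack([d1, d2], dim=-1)

    def forward(self, u_seq, theta0, theta_amb_seq):
        """Roll the thermal state over a whole sequence (backpropagation through time)."""
        theta, preds = theta0, []
        for t in range(u_seq.shape[1]):
            theta = self.step(theta, u_seq[:, t], theta_amb_seq[:, t])
            preds.append(theta)
        return torch.stack(preds, dim=1)

# Hypothetical training step on toy data: minimize the MSE between predicted and
# measured node temperatures; the state vector itself stays physically interpretable.
model = TinyThermalNN()
u_seq = torch.randn(8, 100, 4)           # e.g. currents, voltage, speed, coolant temperature
theta_amb = 25.0 * torch.ones(8, 100)    # ambient temperature in degC
theta_meas = 25.0 + torch.rand(8, 100, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
optimizer.zero_grad()
pred = model(u_seq, theta_meas[:, 0], theta_amb)
loss = torch.nn.functional.mse_loss(pred, theta_meas)
loss.backward()
optimizer.step()

Because the state vector holds the node temperatures themselves, the intermediate quantities remain physically interpretable, which is the property highlighted in the abstract.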
Related papers
- Gridded Transformer Neural Processes for Large Unstructured Spatio-Temporal Data [47.14384085714576]
We introduce gridded pseudo-token Transformer Neural Processes (TNPs), which handle unstructured observations using a processor containing gridded pseudo-tokens that leverage efficient attention mechanisms.
Our method consistently outperforms a range of strong baselines on various synthetic and real-world regression tasks involving large-scale data.
The real-life experiments are performed on weather data, demonstrating the potential of our approach to bring performance and computational benefits when applied at scale in a weather modelling pipeline.
arXiv Detail & Related papers (2024-10-09T10:00:56Z)
- Physics-Informed Machine Learning Towards A Real-Time Spacecraft Thermal Simulator [15.313871831214902]
The PIML (hybrid) model presented here consists of a neural network that predicts reduced nodalizations given on-orbit thermal load conditions.
We compare the computational performance and accuracy of the hybrid model to a data-driven neural-net model and a high-fidelity finite-difference model of a prototype Earth-orbiting small spacecraft.
The PIML-based active nodalization approach provides significantly better generalization than the neural-net and coarse-mesh models, while reducing computing cost by up to 1.7x compared to the high-fidelity model.
arXiv Detail & Related papers (2024-07-08T16:38:52Z)
- Towards Physically Consistent Deep Learning For Climate Model Parameterizations [46.07009109585047]
Climate model parameterizations are a major source of systematic errors and large uncertainties in climate projections.
Deep learning (DL)-based parameterizations, trained on data from computationally expensive short, high-resolution simulations, have shown great promise for improving climate models.
We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models.
arXiv Detail & Related papers (2024-06-06T10:02:49Z)
- Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk and read back for training.
This work proposes an open-source online training framework for deep surrogate models.
arXiv Detail & Related papers (2023-06-28T12:02:27Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data (a generic sketch of this surrogate-based parameter recovery follows this list).
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Hybrid full-field thermal characterization of additive manufacturing processes using physics-informed neural networks with data [5.653328302363391]
We develop a hybrid physics-based data-driven thermal modeling approach of AM processes using physics-informed neural networks.
Partially observed temperature data measured from an infrared camera is combined with the physics laws to predict full-field temperature history.
Results show that the hybrid thermal model can effectively identify unknown parameters and capture the full-field temperature accurately (a generic sketch of such a data-plus-physics loss follows this list).
arXiv Detail & Related papers (2022-06-15T18:27:10Z)
- A physics and data co-driven surrogate modeling approach for temperature field prediction on irregular geometric domain [12.264200001067797]
We propose a novel physics and data co-driven surrogate modeling method for temperature field prediction.
Numerical results demonstrate that our method can significantly improve prediction accuracy on a smaller dataset.
arXiv Detail & Related papers (2022-03-15T08:43:24Z)
- On Energy-Based Models with Overparametrized Shallow Neural Networks [44.74000986284978]
Energy-based models (EBMs) are a powerful framework for generative modeling.
In this work we focus on shallow neural networks.
We show that models trained in the so-called "active" regime provide a statistical advantage over their associated "lazy" or kernel regime.
arXiv Detail & Related papers (2021-04-15T15:34:58Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Thermodynamics-based Artificial Neural Networks for constitutive modeling [0.0]
We propose a new class of data-driven, physics-based, neural networks for modeling of strain rate independent processes at the material point level.
The two basic principles of thermodynamics are encoded in the network's architecture by taking advantage of automatic differentiation.
We demonstrate the wide applicability of TANNs for modeling elasto-plastic materials, with strain hardening and strain softening.
arXiv Detail & Related papers (2020-05-25T15:56:34Z)
- Data-Driven Permanent Magnet Temperature Estimation in Synchronous Motors with Supervised Machine Learning [0.0]
Monitoring the magnet temperature in permanent magnet synchronous motors (PMSMs) for automotive applications is a challenging task.
Overheating results in severe motor deterioration and is thus of high concern for the machine's control strategy and its design.
Several machine learning (ML) models are empirically evaluated on their estimation accuracy for the task of predicting latent high-dynamic magnet temperature profiles.
arXiv Detail & Related papers (2020-01-17T11:41:02Z)
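Several of the entries above (for example the additive-manufacturing and GatedPINN papers) rely on physics-informed neural networks, in which the training loss combines a data term on sparse measurements with the residual of the governing heat equation evaluated by automatic differentiation. The following is a generic, hypothetical sketch of that composite loss for the 1-D heat equation $u_t = \alpha u_{xx}$ in PyTorch; the network size, diffusivity, and toy data are assumptions and do not reproduce any specific paper.

import math
import torch
import torch.nn as nn

# Network mapping (x, t) to temperature u(x, t); sizes are illustrative.
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
alpha = 0.1  # assumed thermal diffusivity

def pde_residual(x, t):
    """Residual of u_t - alpha * u_xx obtained via automatic differentiation."""
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    u = net(torch.cat([x, t], dim=-1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - alpha * u_xx

# Sparse "sensor" measurements (analytic toy solution) and random collocation points.
x_d, t_d = torch.rand(64, 1), torch.rand(64, 1)
u_d = torch.sin(math.pi * x_d) * torch.exp(-alpha * math.pi ** 2 * t_d)
x_c, t_c = torch.rand(256, 1), torch.rand(256, 1)

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(100):
    optimizer.zero_grad()
    data_loss = ((net(torch.cat([x_d, t_d], dim=-1)) - u_d) ** 2).mean()
    physics_loss = (pde_residual(x_c, t_c) ** 2).mean()
    (data_loss + physics_loss).backward()
    optimizer.step()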
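The implicit-neural-representation entry above describes fitting a differentiable surrogate to simulated data once and then recovering unknown model parameters from measurements by gradient descent through that surrogate. Below is a hypothetical, self-contained sketch of that workflow with a toy forward model; the decay-law forward model and all names are illustrative assumptions, not the paper's code.

import torch
import torch.nn as nn

def forward_model(coupling, q):
    """Toy stand-in for an expensive simulation: signal decays with rate 'coupling'."""
    return torch.exp(-coupling * q)

# 1) Train a differentiable surrogate once on simulated (coupling, q) -> signal pairs.
surrogate = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(2000):
    coupling = 2.0 * torch.rand(256, 1)
    q = 3.0 * torch.rand(256, 1)
    pred = surrogate(torch.cat([coupling, q], dim=-1))
    loss = ((pred - forward_model(coupling, q)) ** 2).mean()
    optimizer.zero_grad(); loss.backward(); optimizer.step()

# 2) Freeze the surrogate and recover the unknown coupling from "experimental" data
#    by gradient descent through the surrogate.
for p in surrogate.parameters():
    p.requires_grad_(False)
q_exp = torch.linspace(0.1, 3.0, 50).unsqueeze(-1)
data = forward_model(torch.tensor(1.3), q_exp)          # pretend measurements, true coupling = 1.3
coupling_hat = torch.tensor([[0.5]], requires_grad=True)
fit_opt = torch.optim.Adam([coupling_hat], lr=0.05)
for _ in range(500):
    pred = surrogate(torch.cat([coupling_hat.expand_as(q_exp), q_exp], dim=-1))
    residual = ((pred - data) ** 2).mean()
    fit_opt.zero_grad(); residual.backward(); fit_opt.step()
print("estimated coupling:", coupling_hat.item())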