Learning Physics between Digital Twins with Low-Fidelity Models and
Physics-Informed Gaussian Processes
- URL: http://arxiv.org/abs/2206.08201v2
- Date: Tue, 2 May 2023 13:53:42 GMT
- Title: Learning Physics between Digital Twins with Low-Fidelity Models and
Physics-Informed Gaussian Processes
- Authors: Michail Spitieris and Ingelin Steinsland
- Abstract summary: We introduce a fully Bayesian methodology for learning between digital twins in a setting where the physical parameters of each individual are of interest.
A model discrepancy term is incorporated in the model formulation of each personalized model to account for the missing physics of the low-fidelity model.
Case studies show that 1) models not accounting for imperfect physical models are biased and over-confident, 2) the models accounting for imperfect physical models are more uncertain but cover the truth, and 3) the models learning between digital twins have less uncertainty than the corresponding independent individual models, but are not over-confident.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A digital twin is a computer model that represents an individual, for
example, a component, a patient or a process. In many situations, we want to
gain knowledge about an individual from its data while incorporating imperfect
physical knowledge and also learn from data from other individuals. In this
paper, we introduce a fully Bayesian methodology for learning between digital
twins in a setting where the physical parameters of each individual are of
interest. A model discrepancy term is incorporated in the model formulation of
each personalized model to account for the missing physics of the low-fidelity
model. To allow sharing of information between individuals, we introduce a
Bayesian Hierarchical modelling framework where the individual models are
connected through a new level in the hierarchy. Our methodology is demonstrated
in two case studies, a toy example previously used in the literature extended
to more individuals and a cardiovascular model relevant for the treatment of
Hypertension. The case studies show that 1) models not accounting for imperfect
physical models are biased and over-confident, 2) the models accounting for
imperfect physical models are more uncertain but cover the truth, and 3) the models
learning between digital twins have less uncertainty than the corresponding
independent individual models, but are not over-confident.
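The formulation described in the abstract follows the classic model-discrepancy setup: observations are modelled as the low-fidelity physical model plus a Gaussian-process discrepancy term plus noise. The sketch below is a minimal, hypothetical illustration of that structure for a single individual with a known physical parameter; the linear low-fidelity model `eta`, the sinusoidal "missing physics", and all kernel hyperparameters are invented for demonstration and are not taken from the paper.

```python
import numpy as np

def eta(x, theta):
    """Hypothetical low-fidelity physical model (linear in x)."""
    return theta * x

def rbf_kernel(xa, xb, var=1.0, ls=0.5):
    """Squared-exponential covariance for the discrepancy GP."""
    d = xa[:, None] - xb[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

rng = np.random.default_rng(0)
theta_true = 2.0
sigma_n = 0.05
x = np.linspace(0.0, 1.0, 20)

# Simulated "truth": low-fidelity model plus missing physics (the
# discrepancy, here sin(3x)) plus observation noise.
y = eta(x, theta_true) + np.sin(3.0 * x) + rng.normal(0.0, sigma_n, x.size)

# GP posterior for the discrepancy delta, conditioning on the residuals
# y - eta(x, theta). In the paper's fully Bayesian setting theta is
# inferred jointly (with a hierarchical prior tying individuals together);
# here it is fixed to keep the algebra to standard GP regression.
residual = y - eta(x, theta_true)
K = rbf_kernel(x, x) + sigma_n**2 * np.eye(x.size)
x_star = np.linspace(0.0, 1.0, 50)
K_star = rbf_kernel(x_star, x)
delta_mean = K_star @ np.linalg.solve(K, residual)
delta_cov = rbf_kernel(x_star, x_star) - K_star @ np.linalg.solve(K, K_star.T)
```

Because the discrepancy absorbs the missing physics, the posterior mean of `delta` tracks the sin(3x) term the low-fidelity model cannot represent, and the posterior covariance quantifies the extra uncertainty that case study 2) refers to.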
Related papers
- HyPer-EP: Meta-Learning Hybrid Personalized Models for Cardiac Electrophysiology [7.230055455268642]
We present a novel hybrid modeling framework to describe a personalized cardiac digital twin.
We then present a novel meta-learning framework to enable the separate identification of both the physics-based and neural components.
arXiv Detail & Related papers (2024-03-15T02:30:00Z)
- Fantastic Gains and Where to Find Them: On the Existence and Prospect of
General Knowledge Transfer between Any Pretrained Model [74.62272538148245]
We show that for arbitrary pairings of pretrained models, one model extracts significant data context unavailable in the other.
We investigate if it is possible to transfer such "complementary" knowledge from one model to another without performance degradation.
arXiv Detail & Related papers (2023-10-26T17:59:46Z)
- Combining General and Personalized Models for Epilepsy Detection with
Hyperdimensional Computing [4.538319875483978]
Epilepsy is a chronic neurological disorder with a significant prevalence.
There is still no adequate technological support to enable epilepsy detection and continuous outpatient monitoring in everyday life.
In this work, we demonstrate a few additional aspects in which HD computing, and the way its models are built and stored, can be used for further understanding, comparing, and creating more advanced machine learning models for epilepsy detection.
arXiv Detail & Related papers (2023-03-26T14:51:25Z)
- Dataless Knowledge Fusion by Merging Weights of Language Models [51.8162883997512]
Fine-tuning pre-trained language models has become the prevalent paradigm for building downstream NLP models.
This creates a barrier to fusing knowledge across individual models to yield a better single model.
We propose a dataless knowledge fusion method that merges models in their parameter space.
arXiv Detail & Related papers (2022-12-19T20:46:43Z)
- Are Neural Topic Models Broken? [81.15470302729638]
We study the relationship between automated and human evaluation of topic models.
We find that neural topic models fare worse in both respects compared to an established classical method.
arXiv Detail & Related papers (2022-10-28T14:38:50Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Synthetic Model Combination: An Instance-wise Approach to Unsupervised
Ensemble Learning [92.89846887298852]
Consider making a prediction over new test data without any opportunity to learn from a training set of labelled data, given access only to a set of expert models and their predictions, alongside some limited information about the dataset used to train them.
arXiv Detail & Related papers (2022-10-11T10:20:31Z)
- PhysiNet: A Combination of Physics-based Model and Neural Network Model
for Digital Twins [0.5076419064097732]
This paper proposes a model that combines the physics-based model and the neural network model to improve the prediction accuracy for the whole life cycle of a system.
Experiments showed that the proposed hybrid model outperformed both the physics-based model and the neural network model.
arXiv Detail & Related papers (2021-06-28T15:13:16Z)
- Learning physically consistent mathematical models from data using group
sparsity [2.580765958706854]
In areas like biology, high noise levels, sensor-induced correlations, and strong inter-system variability can render data-driven models nonsensical or physically inconsistent.
We show several applications from systems biology that demonstrate the benefits of enforcing priors in data-driven modeling.
arXiv Detail & Related papers (2020-12-11T14:45:38Z)
- Hybrid modeling: Applications in real-time diagnosis [64.5040763067757]
We outline a novel hybrid modeling approach that combines machine learning inspired models and physics-based models.
We are using such models for real-time diagnosis applications.
arXiv Detail & Related papers (2020-03-04T00:44:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.