Physics-integrated generative modeling using attentive planar normalizing flow based variational autoencoder
- URL: http://arxiv.org/abs/2404.12267v1
- Date: Thu, 18 Apr 2024 15:38:14 GMT
- Title: Physics-integrated generative modeling using attentive planar normalizing flow based variational autoencoder
- Authors: Sheikh Waqas Akhtar
- Abstract summary: We aim to improve the fidelity of reconstruction and the robustness to noise in the physics-integrated generative model.
To improve the robustness of the generative model against noise injected into the model, we propose a modification to the encoder part of the normalizing flow based VAE.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-integrated generative modeling is a class of hybrid or grey-box modeling in which we augment the data-driven model with the physics knowledge governing the data distribution. The use of physics knowledge allows the generative model to produce output in a controlled way, so that the output, by construction, complies with the physical laws. It imparts improved generalization ability to extrapolate beyond the training distribution as well as improved interpretability, because the model is partly grounded in firm domain knowledge. In this work, we aim to improve the fidelity of reconstruction and the robustness to noise in the physics-integrated generative model. To this end, we use a variational autoencoder as the generative model. To improve the reconstruction results of the decoder, we propose to learn the latent posterior distribution of both the physics and the trainable data-driven components using planar normalizing flow. A normalizing flow based posterior distribution harnesses the inherent dynamical structure of the data distribution, so the learned model gets closer to the true underlying data distribution. To improve the robustness of the generative model against noise injected into the model, we propose a modification to the encoder part of the normalizing flow based VAE. We design the encoder to incorporate scaled dot-product attention based contextual information into the noisy latent vector, which mitigates the adverse effect of noise in the latent vector and makes the model more robust. We empirically evaluate our models on a human locomotion dataset [33], and the results validate the efficacy of our proposed models in terms of improvement in reconstruction quality as well as robustness against noise injected in the model.
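The abstract describes two mechanisms: a planar normalizing flow applied to the approximate posterior, and scaled dot-product attention used to inject contextual information into a noisy latent vector. The following is a minimal PyTorch sketch of both ideas, not the author's implementation; module names, tensor shapes, and the use of encoder feature tokens as attention context are assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): a planar flow step for the VAE
# posterior and scaled dot-product attention over a noisy latent vector.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PlanarFlow(nn.Module):
    """One planar flow step: f(z) = z + u * tanh(w^T z + b)."""

    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):                                # z: (batch, dim)
        # Constrain u so that w^T u >= -1, keeping the map invertible.
        wu = self.w @ self.u
        u_hat = self.u + (F.softplus(wu) - 1.0 - wu) * self.w / (self.w.norm() ** 2 + 1e-8)
        lin = z @ self.w + self.b                        # (batch,)
        f_z = z + u_hat * torch.tanh(lin).unsqueeze(-1)
        # log|det df/dz| = log|1 + u_hat^T psi(z)|, with psi(z) = (1 - tanh^2) * w
        psi = (1.0 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1.0 + psi @ u_hat) + 1e-8)
        return f_z, log_det


def attend_noisy_latent(z_noisy, context, d_k):
    """Scaled dot-product attention: re-weight a noisy latent using context features."""
    q = z_noisy.unsqueeze(1)                             # (batch, 1, d_k)
    k = v = context                                      # (batch, n_ctx, d_k)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5        # (batch, 1, n_ctx)
    return (scores.softmax(dim=-1) @ v).squeeze(1)       # (batch, d_k)


# Usage sketch: pass the reparameterized latent through K planar steps and subtract
# the accumulated log-determinants from the KL term of the ELBO.
if __name__ == "__main__":
    batch, dim, n_ctx = 8, 16, 4
    z0 = torch.randn(batch, dim)                         # sample from q(z0|x)
    context = torch.randn(batch, n_ctx, dim)             # encoder feature tokens (assumed)
    z0 = attend_noisy_latent(z0, context, dim)           # contextual smoothing of the latent
    flows = nn.ModuleList([PlanarFlow(dim) for _ in range(4)])
    log_det_sum = torch.zeros(batch)
    z = z0
    for flow in flows:
        z, log_det = flow(z)
        log_det_sum = log_det_sum + log_det
    # log q(z_K|x) = log q(z_0|x) - sum_k log|det J_k|; z feeds the decoder.
```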
Related papers
- Generating Synthetic Net Load Data with Physics-informed Diffusion Model [0.8848340429852071]
A conditional denoising neural network is designed to jointly train the parameters of the transition kernel of the diffusion model.
A comprehensive set of evaluation metrics is used to assess the accuracy and diversity of the generated synthetic net load data.
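As a rough illustration of the conditional denoising training summarized above, the snippet below shows a generic DDPM-style noise-prediction step; it is not the paper's code, and `model`, `cond`, and the schedule tensor are assumed placeholders.

```python
# Generic conditional diffusion training step (noise prediction); names are assumptions.
import torch
import torch.nn.functional as F

def diffusion_training_step(model, x0, cond, alphas_cumprod):
    """Predict the injected noise given the noised sample x_t, step t, and a condition."""
    b = x0.shape[0]
    t = torch.randint(0, len(alphas_cumprod), (b,), device=x0.device)
    a_bar = alphas_cumprod[t].view(b, *([1] * (x0.dim() - 1)))
    noise = torch.randn_like(x0)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise   # forward transition kernel
    eps_pred = model(x_t, t, cond)                            # conditional denoiser
    return F.mse_loss(eps_pred, noise)
```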
arXiv Detail & Related papers (2024-06-04T02:50:19Z)
- Data-driven Nonlinear Model Reduction using Koopman Theory: Integrated Control Form and NMPC Case Study [56.283944756315066]
We propose generic model structures combining delay-coordinate encoding of measurements and full-state decoding to integrate reduced Koopman modeling and state estimation.
A case study demonstrates that our approach provides accurate control models and enables real-time capable nonlinear model predictive control of a high-purity cryogenic distillation column.
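A hedged, generic sketch of delay-coordinate Koopman-style modeling as summarized above: stack delayed measurements, fit a linear one-step predictor by least squares, and decode back to the full state. This is not the paper's integrated control form; all names and shapes are assumptions.

```python
# Delay-coordinate embedding plus linear (Koopman/DMD-style) fitting; a generic sketch.
import numpy as np

def delay_embed(y, d):
    """Stack d consecutive measurement vectors into one delay-coordinate vector."""
    n = y.shape[0] - d + 1
    return np.hstack([y[i:i + n] for i in range(d)])           # (n, d * n_y)

def fit_koopman(y, x, d):
    Z = delay_embed(y, d)                                       # lifted coordinates
    A = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)[0]           # z_{k+1} ≈ z_k A
    C = np.linalg.lstsq(Z, x[d - 1:], rcond=None)[0]            # full-state decoder x ≈ z C
    return A, C
```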
arXiv Detail & Related papers (2024-01-09T11:54:54Z)
- A Physics-informed Diffusion Model for High-fidelity Flow Field Reconstruction [0.0]
We propose a diffusion model which only uses high-fidelity data at training.
With different configurations, our model is able to reconstruct high-fidelity data from either a regular low-fidelity sample or a sparsely measured sample.
Our model can produce accurate reconstruction results for 2d turbulent flows based on different input sources without retraining.
arXiv Detail & Related papers (2022-11-26T23:14:18Z)
- TrafficFlowGAN: Physics-informed Flow based Generative Adversarial Network for Uncertainty Quantification [4.215251065887861]
We propose TrafficFlowGAN, a physics-informed flow based generative adversarial network (GAN) for uncertainty quantification (UQ) of dynamical systems.
This flow model is trained to maximize the data likelihood and to generate synthetic data that can fool a convolutional discriminator.
To the best of our knowledge, we are the first to propose an integration of flow, GAN and PIDL for the UQ problems.
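The combined objective such a flow-plus-GAN model typically optimizes is, roughly, a negative log-likelihood plus an adversarial term and a physics-informed residual; the exact terms and weights used in TrafficFlowGAN may differ from this hedged sketch.

```latex
% Hedged sketch of a combined flow + GAN + physics-informed objective (weights lambda assumed).
\min_{\theta}\max_{\phi}\;
  -\,\mathbb{E}_{x\sim p_{\text{data}}}\big[\log p_{\theta}(x)\big]
  \;+\;\lambda_{\text{adv}}\Big(\mathbb{E}_{x\sim p_{\text{data}}}\big[\log D_{\phi}(x)\big]
      +\mathbb{E}_{\hat{x}\sim p_{\theta}}\big[\log\big(1-D_{\phi}(\hat{x})\big)\big]\Big)
  \;+\;\lambda_{\text{phys}}\,\mathcal{R}_{\text{physics}}(\theta)
```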
arXiv Detail & Related papers (2022-06-19T03:35:12Z)
- On the Generalization and Adaption Performance of Causal Models [99.64022680811281]
Differentiable causal discovery has been proposed to factorize the data-generating process into a set of modules.
We study the generalization and adaption performance of such modular neural causal models.
Our analysis shows that the modular neural causal models outperform other models on both zero and few-shot adaptation in low data regimes.
arXiv Detail & Related papers (2022-06-09T17:12:32Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
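A generic sketch of the architectural idea mentioned above (dilated temporal convolutions over a history window followed by dense feed-forward layers); the physics-inspired components of PI-TCN are not shown, and all sizes are assumptions.

```python
# Generic dilated temporal-convolution dynamics model; sizes and names are assumptions.
import torch
import torch.nn as nn

class TemporalDynamicsNet(nn.Module):
    def __init__(self, in_ch=10, hidden=64, out_dim=6):
        super().__init__()
        self.tcn = nn.Sequential(  # growing dilations enlarge the receptive field
            nn.Conv1d(in_ch, hidden, kernel_size=3, dilation=1, padding=1), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=2, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=4, padding=4), nn.ReLU(),
        )  # symmetric padding keeps the length; a causal variant would pad only on the left
        self.head = nn.Sequential(  # dense feed-forward prediction head
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, out_dim),
        )

    def forward(self, x):                    # x: (batch, in_ch, history)
        h = self.tcn(x)[..., -1]             # features at the latest time step
        return self.head(h)                  # predicted next-step dynamics
```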
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Uncertainty quantification and inverse modeling for subsurface flow in 3D heterogeneous formations using a theory-guided convolutional encoder-decoder network [5.018057056965207]
We build surrogate models for dynamic 3D subsurface single-phase flow problems with multiple vertical producing wells.
The surrogate model provides efficient pressure estimation of the entire formation at any timestep.
The well production rate or bottom hole pressure can then be determined based on Peaceman's formula.
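The Peaceman well model referenced above relates the simulated grid-block pressure to the well rate; in its basic single-phase, isotropic form (anisotropy corrections and formation volume factors omitted) it reads roughly as follows.

```latex
% Basic single-phase Peaceman well model (isotropic square grid blocks assumed).
q \;=\; \frac{2\pi k h}{\mu\,\big(\ln(r_o/r_w) + s\big)}\,\big(p_{\text{block}} - p_{wf}\big),
\qquad r_o \approx 0.2\,\Delta x
```

Here q is the well rate, p_block the simulated grid-block pressure, p_wf the bottom-hole flowing pressure, r_w the wellbore radius, s the skin factor, and Δx the grid-block size.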
arXiv Detail & Related papers (2021-11-14T10:11:46Z)
- Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters than the original network, without loss of accuracy.
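As a small, generic illustration of pruning a continuous-depth model, the snippet below magnitude-prunes the network that defines a neural ODE's vector field using PyTorch's pruning utilities; this is not the paper's procedure, and the 90% sparsity level is an arbitrary example.

```python
# Magnitude pruning of a neural ODE vector field f_theta(z), dz/dt = f_theta(z); a sketch.
import torch.nn as nn
import torch.nn.utils.prune as prune

vector_field = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

for module in vector_field.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.9)  # keep the largest 10%
        prune.remove(module, "weight")                            # make the mask permanent
```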
arXiv Detail & Related papers (2021-06-24T01:40:17Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
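A hedged sketch of the architectural idea above, which the main paper builds on: split the latent code into a physics part consumed by a known (possibly incomplete) physics model and a trainable part that corrects its output. The additive composition and all names are illustrative assumptions, not the paper's exact design.

```python
# Sketch of a VAE whose latent space is partly grounded by a physics model; names assumed.
import torch
import torch.nn as nn

class PhysicsGroundedVAE(nn.Module):
    def __init__(self, encoder, physics_model, correction_net, decoder):
        super().__init__()
        self.encoder = encoder        # x -> (mu, logvar) for the full latent [z_P, z_A]
        self.physics = physics_model  # differentiable simulator f_P(z_P)
        self.correct = correction_net # data-driven residual f_A(z_A, f_P(z_P))
        self.decoder = decoder

    def forward(self, x, dim_p):
        mu, logvar = self.encoder(x)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()    # reparameterization
        z_p, z_a = z[:, :dim_p], z[:, dim_p:]
        phys_out = self.physics(z_p)                            # physically grounded part
        x_hat = self.decoder(phys_out + self.correct(z_a, phys_out))
        return x_hat, mu, logvar
```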
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
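As a minimal illustration of the per-time-stamp parameterization mentioned above, the sketch below uses a GRU that emits a mean and log-variance for every time step; it is a generic stand-in, not the SISVAE architecture or its smoothness-inducing prior.

```python
# Per-time-stamp Gaussian parameterization for a sequential model; a generic sketch.
import torch
import torch.nn as nn

class PerTimestepGaussian(nn.Module):
    def __init__(self, in_dim, hidden):
        super().__init__()
        self.rnn = nn.GRU(in_dim, hidden, batch_first=True)
        self.mu = nn.Linear(hidden, in_dim)
        self.logvar = nn.Linear(hidden, in_dim)

    def forward(self, x):                      # x: (batch, time, in_dim)
        h, _ = self.rnn(x)
        return self.mu(h), self.logvar(h)      # per-time-stamp mean and log-variance
```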
arXiv Detail & Related papers (2021-02-02T06:15:15Z)