Generalised Latent Assimilation in Heterogeneous Reduced Spaces with
Machine Learning Surrogate Models
- URL: http://arxiv.org/abs/2204.03497v2
- Date: Fri, 8 Apr 2022 14:46:26 GMT
- Title: Generalised Latent Assimilation in Heterogeneous Reduced Spaces with
Machine Learning Surrogate Models
- Authors: Sibo Cheng and Jianhua Chen and Charitos Anastasiou and Panagiota
Angeli and Omar K. Matar and Yi-Ke Guo and Christopher C. Pain and Rossella
Arcucci
- Abstract summary: We develop a system which combines reduced-order surrogate models with a novel data assimilation technique.
Generalised Latent Assimilation benefits from both the efficiency provided by reduced-order modelling and the accuracy of data assimilation.
- Score: 10.410970649045943
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Reduced-order modelling and low-dimensional surrogate models generated
using machine learning algorithms have been widely applied to high-dimensional
dynamical systems to improve algorithmic efficiency. In this paper, we develop a
system which combines reduced-order surrogate models with a novel data
assimilation (DA) technique used to incorporate real-time observations from
different physical spaces. We make use of local smooth surrogate functions which
link the space of encoded system variables with that of the current observations
to perform variational DA at a low computational cost. The new system, named
Generalised Latent Assimilation, benefits from both the efficiency provided by
reduced-order modelling and the accuracy of data assimilation. This paper also
provides a theoretical analysis of the difference between the surrogate and the
original assimilation cost functions, giving an upper bound that depends on the
size of the local training set. The new approach is tested on a high-dimensional
CFD application of a two-phase liquid flow with non-linear observation operators
that current Latent Assimilation methods cannot handle. Numerical results
demonstrate that the proposed assimilation approach can significantly improve
the reconstruction and prediction accuracy of the deep learning surrogate model,
which is nearly 1000 times faster than the CFD simulation.
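
To make the latent variational step concrete, below is a minimal sketch in
Python/NumPy of a 3D-Var-style cost function minimised in the reduced space. The
surrogate function h_surrogate, the dimensions, and the covariances are
illustrative assumptions standing in for the paper's locally trained smooth
surrogates, not the authors' implementation.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative sizes: latent state of dimension 5, observations of dimension 3.
    rng = np.random.default_rng(0)
    n_latent, n_obs = 5, 3

    # Hypothetical smooth surrogate linking encoded state to observation space.
    # In Generalised Latent Assimilation this is fitted on a local training set;
    # a fixed non-linear map stands in for it here.
    W = rng.normal(size=(n_obs, n_latent))
    def h_surrogate(z):
        return W @ z + 0.1 * (W @ z) ** 2

    # Background (prior) latent state and inverse error covariances.
    z_b = rng.normal(size=n_latent)          # encoded background state
    B_inv = np.eye(n_latent)                 # inverse background covariance
    R_inv = np.eye(n_obs)                    # inverse observation covariance

    # Synthetic observation: truth perturbed by noise.
    z_true = z_b + 0.5 * rng.normal(size=n_latent)
    y = h_surrogate(z_true) + 0.05 * rng.normal(size=n_obs)

    def cost(z):
        """3D-Var cost: J(z) = 1/2 (z - z_b)^T B^-1 (z - z_b)
                             + 1/2 (y - h(z))^T R^-1 (y - h(z))."""
        dz = z - z_b
        dy = y - h_surrogate(z)
        return 0.5 * dz @ B_inv @ dz + 0.5 * dy @ R_inv @ dy

    # Minimise the surrogate cost in the low-dimensional latent space.
    z_a = minimize(cost, z_b, method="L-BFGS-B").x
    print("background misfit:", np.linalg.norm(z_b - z_true))
    print("analysis misfit:  ", np.linalg.norm(z_a - z_true))

Because the cost is evaluated entirely in the reduced space, each assimilation
step avoids the full-order CFD model, which is where the reported speed-up
comes from.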
Related papers
- DOF: Accelerating High-order Differential Operators with Forward
Propagation [40.71528485918067]
We propose an efficient framework, Differential Operator with Forward-propagation (DOF), for calculating general second-order differential operators without losing any precision.
We demonstrate a two-fold improvement in efficiency and reduced memory consumption across architectures.
Empirical results illustrate that our method surpasses traditional automatic differentiation (AutoDiff) techniques, achieving a 2x improvement when exploiting structure and nearly a 20x improvement when exploiting sparsity (a minimal forward-mode sketch appears after this list).
arXiv Detail & Related papers (2024-02-15T05:59:21Z)
- AI enhanced data assimilation and uncertainty quantification applied to
Geological Carbon Storage [0.0]
We introduce the Surrogate-based hybrid ESMDA (SH-ESMDA), an adaptation of the traditional Ensemble Smoother with Multiple Data Assimilation (ESMDA).
We also introduce the Surrogate-based Hybrid RML (SH-RML), a variational data assimilation approach that relies on randomized maximum likelihood (RML).
Our comparative analyses show that SH-RML offers better uncertainty quantification than conventional ESMDA for the case study.
arXiv Detail & Related papers (2024-02-09T00:24:46Z)
- Subsurface Characterization using Ensemble-based Approaches with Deep
Generative Models [2.184775414778289]
Inverse modeling is limited for ill-posed, high-dimensional applications due to computational costs and poor prediction accuracy with sparse datasets.
We combine a Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) and the Ensemble Smoother with Multiple Data Assimilation (ES-MDA).
WGAN-GP is trained to generate high-dimensional K fields from a low-dimensional latent space, and ES-MDA updates the latent variables by assimilating available measurements (a minimal latent ES-MDA sketch appears after this list).
arXiv Detail & Related papers (2023-10-02T01:27:10Z)
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image
Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Efficient hybrid modeling and sorption model discovery for non-linear
advection-diffusion-sorption systems: A systematic scientific machine
learning approach [0.0]
This study presents a systematic machine learning approach for creating efficient hybrid models and discovering sorption uptake models in non-linear advection-diffusion-sorption systems.
It demonstrates an effective method to train these complex systems using gradient-based analysis, adjoint sensitivity analysis, and JIT-compiled vector-Jacobian products, combined with spatial and adaptive discretization.
arXiv Detail & Related papers (2023-03-22T23:05:28Z)
- An Accelerated Doubly Stochastic Gradient Method with Faster Explicit
Model Identification [97.28167655721766]
We propose a novel accelerated doubly stochastic gradient descent (ADSGD) method for sparsity regularized loss minimization problems.
We first prove that ADSGD can achieve a linear convergence rate and lower overall computational complexity.
arXiv Detail & Related papers (2022-08-11T22:27:22Z)
- Quantized Adaptive Subgradient Algorithms and Their Applications [39.103587572626026]
We propose quantized composite mirror descent adaptive subgradient (QCMD adagrad) and quantized regularized dual average adaptive subgradient (QRDA adagrad) for distributed training.
A quantized gradient-based adaptive learning rate matrix is constructed to achieve a balance between communication costs, accuracy, and model sparsity.
arXiv Detail & Related papers (2022-08-11T04:04:03Z)
- MACE: An Efficient Model-Agnostic Framework for Counterfactual
Explanation [132.77005365032468]
We propose a novel framework of Model-Agnostic Counterfactual Explanation (MACE)
In our MACE approach, we propose a novel RL-based method for finding good counterfactual examples and a gradient-less descent method for improving proximity.
Experiments on public datasets validate its effectiveness, showing better validity, sparsity, and proximity.
arXiv Detail & Related papers (2022-05-31T04:57:06Z)
- Observation Error Covariance Specification in Dynamical Systems for Data
assimilation using Recurrent Neural Networks [0.5330240017302621]
We propose a data-driven approach based on long short-term memory (LSTM) recurrent neural networks (RNNs).
The proposed approach does not require any knowledge or assumption about prior error distribution.
We have compared the novel approach with two state-of-the-art covariance tuning algorithms, namely DI01 and D05.
arXiv Detail & Related papers (2021-11-11T20:23:00Z)
- Surrogate Models for Optimization of Dynamical Systems [0.0]
This paper provides a smart, data-driven mechanism to construct low-dimensional surrogate models.
These surrogate models reduce the computational time for the solution of complex optimization problems.
arXiv Detail & Related papers (2021-01-22T14:09:30Z)
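
For the DOF entry above, here is a minimal illustration in plain Python (not
the authors' code) of propagating second-order derivative information forward
through arithmetic, in the spirit of forward-propagation of differential
operators. The Dual2 class and d2 helper are hypothetical names introduced for
this sketch.

    class Dual2:
        """A 2-jet (value, first derivative, second derivative) that carries
        derivative information forward through arithmetic operations."""
        def __init__(self, f, df=0.0, ddf=0.0):
            self.f, self.df, self.ddf = f, df, ddf
        def __add__(self, o):
            o = o if isinstance(o, Dual2) else Dual2(o)
            return Dual2(self.f + o.f, self.df + o.df, self.ddf + o.ddf)
        __radd__ = __add__
        def __mul__(self, o):
            # Product rule: (fg)'' = f''g + 2 f'g' + fg''.
            o = o if isinstance(o, Dual2) else Dual2(o)
            return Dual2(self.f * o.f,
                         self.f * o.df + self.df * o.f,
                         self.f * o.ddf + 2 * self.df * o.df + self.ddf * o.f)
        __rmul__ = __mul__

    def d2(fn, x):
        """Second derivative of fn at x in a single forward pass."""
        return fn(Dual2(x, 1.0, 0.0)).ddf

    # u(x) = x^3 + 2x  ->  u''(x) = 6x, so u''(1.5) = 9.0
    print(d2(lambda x: x * x * x + 2 * x, 1.5))

Because both derivative orders travel with the value in one pass, no backward
tape needs to be stored, which is the source of the memory savings such
forward-propagation methods report.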
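For the ensemble entries above (SH-ESMDA, and WGAN-GP with ES-MDA), here is a
minimal NumPy sketch of ES-MDA update steps applied to latent variables. The
linear forward map, ensemble size, and inflation schedule are illustrative
assumptions; in the cited works the forward map would be a trained surrogate or
generator network rather than a fixed matrix.

    import numpy as np

    rng = np.random.default_rng(1)
    n_lat, n_obs, n_ens, n_mda = 4, 6, 100, 4   # sizes and number of MDA steps

    # Illustrative linear forward map g(z) = G z standing in for the surrogate
    # (a generator network in WGAN-GP + ES-MDA, a trained model in SH-ESMDA).
    G = rng.normal(size=(n_obs, n_lat))
    def g(Z):                       # Z has shape (n_lat, n_ens)
        return G @ Z

    R = 0.05 * np.eye(n_obs)        # observation error covariance
    z_true = rng.normal(size=n_lat)
    d_obs = G @ z_true + rng.multivariate_normal(np.zeros(n_obs), R)

    # Prior latent ensemble.
    Z = rng.normal(size=(n_lat, n_ens))
    alphas = [n_mda] * n_mda        # standard choice: the 1/alpha_i sum to 1

    for alpha in alphas:
        D = g(Z)                                        # predicted data
        dz = Z - Z.mean(axis=1, keepdims=True)
        dd = D - D.mean(axis=1, keepdims=True)
        C_zd = dz @ dd.T / (n_ens - 1)                  # cross-covariance
        C_dd = dd @ dd.T / (n_ens - 1)                  # data covariance
        # Perturb observations with inflated noise, then update each member.
        E = rng.multivariate_normal(np.zeros(n_obs), alpha * R, size=n_ens).T
        K = C_zd @ np.linalg.inv(C_dd + alpha * R)      # Kalman-like gain
        Z = Z + K @ (d_obs[:, None] + E - D)

    print("true latent:   ", z_true)
    print("posterior mean:", Z.mean(axis=1))

Updating in the latent space keeps the ensemble update cheap regardless of how
high-dimensional the fields generated from those latent variables are.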