Multi-scale data reconstruction of turbulent rotating flows with Gappy
POD, Extended POD and Generative Adversarial Networks
- URL: http://arxiv.org/abs/2210.11921v2
- Date: Fri, 3 Nov 2023 19:59:35 GMT
- Title: Multi-scale data reconstruction of turbulent rotating flows with Gappy
POD, Extended POD and Generative Adversarial Networks
- Authors: Tianyi Li, Michele Buzzicotti, Luca Biferale, Fabio Bonaccorso, Shiyi
Chen and Minping Wan
- Abstract summary: In this study, we use linear and non-linear tools for reconstructing rotating turbulence snapshots with spatial damages (inpainting).
We focus on accurately reproducing both statistical properties and instantaneous velocity fields.
Surprisingly enough, concerning point-wise reconstruction, the non-linear GAN does not outperform one of the linear POD techniques.
- Score: 2.0722840259036546
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data reconstruction of rotating turbulent snapshots is investigated utilizing
data-driven tools. This problem is crucial for numerous geophysical
applications and fundamental aspects, given the concurrent effects of direct
and inverse energy cascades, which lead to non-Gaussian statistics at both
large and small scales. Data assimilation also serves as a tool to rank
physical features within turbulence, by evaluating the performance of
reconstruction in terms of the quality and quantity of the information used.
Additionally, benchmarking various reconstruction techniques is essential to
assess the trade-off between quantitative supremacy, implementation complexity,
and explicability. In this study, we use linear and non-linear tools based on
the Proper Orthogonal Decomposition (POD) and Generative Adversarial Network
(GAN) for reconstructing rotating turbulence snapshots with spatial damages
(inpainting). We focus on accurately reproducing both statistical properties
and instantaneous velocity fields. Different gap sizes and gap geometries are
investigated in order to assess the importance of coherency and multi-scale
properties of the missing information. Surprisingly enough, concerning
point-wise reconstruction, the non-linear GAN does not outperform one of the
linear POD techniques. On the other hand, supremacy of the GAN approach is
shown when the statistical multi-scale properties are compared. Similarly,
extreme events in the gap region are better predicted when using GAN. The
balance between point-wise error and statistical properties is controlled by
the adversarial ratio, which determines the relative importance of the
generator and the discriminator in the GAN training. Robustness against the
measurement noise is also discussed.
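As a rough illustration of the Gappy POD approach named in the title, the numpy sketch below fits POD coefficients to the observed (undamaged) points by least squares and uses the retained modes to fill the gap. The random stand-in data, the number of retained modes, and all variable names are assumptions made to keep the example self-contained; this is not the authors' implementation.

```python
# Minimal Gappy POD sketch (illustrative only, not the authors' code).
# POD modes are learned from complete training snapshots; a damaged
# snapshot is then reconstructed by least-squares fitting of the POD
# coefficients on the observed points only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for training snapshots: n_train flattened 2D fields.
n_train, n_points = 200, 64 * 64
X_train = rng.standard_normal((n_train, n_points))

# POD modes via SVD of the mean-subtracted snapshot matrix.
mean_field = X_train.mean(axis=0)
_, _, Vt = np.linalg.svd(X_train - mean_field, full_matrices=False)
n_modes = 20                      # retained modes (illustrative choice)
modes = Vt[:n_modes]              # shape (n_modes, n_points)

# A damaged snapshot: 'mask' is True where data were measured.
x_true = rng.standard_normal(n_points)
mask = rng.random(n_points) > 0.3          # ~30% of points missing

# Fit coefficients a on observed points only: min_a ||Phi_obs a - x_obs||^2
phi_obs = modes[:, mask].T                 # (n_obs, n_modes)
a, *_ = np.linalg.lstsq(phi_obs, (x_true - mean_field)[mask], rcond=None)

# Reconstruct everywhere and keep the measured values where available.
x_rec = mean_field + modes.T @ a
x_filled = np.where(mask, x_true, x_rec)
print("gap-region MSE:", np.mean((x_filled[~mask] - x_true[~mask]) ** 2))
```

Roughly speaking, the Extended POD and GAN tools benchmarked in the paper replace this least-squares fit with, respectively, training-data correlations between the observed and missing regions and a learned non-linear generator.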
Related papers
- Localized Gaussians as Self-Attention Weights for Point Clouds Correspondence [92.07601770031236]
We investigate semantically meaningful patterns in the attention heads of an encoder-only Transformer architecture.
We find that fixing the attention weights not only accelerates the training process but also enhances the stability of the optimization.
arXiv Detail & Related papers (2024-09-20T07:41:47Z) - MCGAN: Enhancing GAN Training with Regression-Based Generator Loss [5.7645234295847345]
A generative adversarial network (GAN) has emerged as a powerful tool for generating high-fidelity data.
We propose an algorithm called Monte Carlo GAN (MCGAN).
This approach, utilizing an innovative generative loss function termed the regression loss, reformulates generator training as a regression task.
We show that our method requires a weaker condition on the discriminator for effective generator training.
arXiv Detail & Related papers (2024-05-27T14:15:52Z) - Uncertainty-Aware Deep Attention Recurrent Neural Network for
Heterogeneous Time Series Imputation [0.25112747242081457]
Missingness is ubiquitous in multivariate time series and poses an obstacle to reliable downstream analysis.
We propose DEep Attention Recurrent Imputation (DEARI), which jointly estimates missing values and their associated uncertainty.
Experiments show that DEARI surpasses the SOTA in diverse imputation tasks on real-world datasets.
arXiv Detail & Related papers (2024-01-04T13:21:11Z) - Generative Adversarial Networks to infer velocity components in rotating
turbulent flows [2.0873604996221946]
We show that CNN and GAN always outperform EPOD concerning both point-wise and statistical reconstructions.
The analysis is performed using standard validation tools based on the $L_2$ spatial distance between the prediction and the ground truth (see the sketch after this list).
arXiv Detail & Related papers (2023-01-18T13:59:01Z) - BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG).
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z) - Convolutional generative adversarial imputation networks for
spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Disentangling Generative Factors of Physical Fields Using Variational
Autoencoders [0.0]
This work explores the use of variational autoencoders (VAEs) for non-linear dimension reduction.
A disentangled decomposition is interpretable and can be transferred to a variety of tasks including generative modeling.
arXiv Detail & Related papers (2021-09-15T16:02:43Z) - Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z) - TadGAN: Time Series Anomaly Detection Using Generative Adversarial
Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that multiplicative noise, as it commonly arises due to variance, leads to heavy tails in the parameters.
A detailed analysis is conducted in which we describe how key factors, including step size and data, affect this behaviour, with similar results exhibited by state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
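Both the main paper and the rotating-flow GAN entry above evaluate reconstructions with a point-wise $L_2$ error and with multi-scale statistics such as the energy spectrum. The sketch below shows one way to compute these two diagnostics on a 2D field with numpy; the field size, spectral binning, and normalisation are illustrative assumptions rather than the papers' exact post-processing.

```python
# Illustrative diagnostics (not the authors' code): point-wise L2 error
# versus an isotropic energy spectrum, the kind of multi-scale statistic
# used to compare GAN and POD reconstructions.
import numpy as np

def pointwise_l2(u_true, u_rec):
    """Mean-squared (L2) error between two fields of the same shape."""
    return np.mean((u_true - u_rec) ** 2)

def isotropic_spectrum(u):
    """Shell-averaged energy spectrum E(k) of a square 2D field."""
    n = u.shape[0]
    uk = np.fft.fft2(u) / n**2
    energy = 0.5 * np.abs(uk) ** 2
    kx, ky = np.meshgrid(np.fft.fftfreq(n, d=1.0 / n),
                         np.fft.fftfreq(n, d=1.0 / n), indexing="ij")
    k = np.sqrt(kx**2 + ky**2)
    # Integer wavenumber shells up to (roughly) the Nyquist wavenumber.
    k_bins = np.arange(0.5, n // 2 + 1, 1.0)
    spectrum = np.array([energy[(k >= lo) & (k < lo + 1.0)].sum()
                         for lo in k_bins])
    return k_bins + 0.5, spectrum

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    u_true = rng.standard_normal((64, 64))
    u_rec = u_true + 0.1 * rng.standard_normal((64, 64))  # fake reconstruction
    print("point-wise MSE:", pointwise_l2(u_true, u_rec))
    k, E = isotropic_spectrum(u_true)
    print("first spectral shells:", E[:5])
```

A reconstruction can score well on one of these diagnostics and poorly on the other; that trade-off is what the adversarial ratio controls in the main paper.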
This list is automatically generated from the titles and abstracts of the papers in this site.