Physics-informed Deep Super-resolution for Spatiotemporal Data
- URL: http://arxiv.org/abs/2208.01462v1
- Date: Tue, 2 Aug 2022 13:57:35 GMT
- Title: Physics-informed Deep Super-resolution for Spatiotemporal Data
- Authors: Pu Ren, Chengping Rao, Yang Liu, Zihan Ma, Qi Wang, Jian-Xun Wang, Hao
Sun
- Abstract summary: Deep learning can be used to augment scientific data based on coarse-grained simulations.
We propose a novel and efficient spatiotemporal super-resolution framework inspired by physics-informed learning.
Results demonstrate the superior effectiveness and efficiency of the proposed method compared with baseline algorithms.
- Score: 18.688475686901082
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: High-fidelity simulation of complex physical systems is exorbitantly
expensive and inaccessible across spatiotemporal scales. Recently, there has
been an increasing interest in leveraging deep learning to augment scientific
data based on coarse-grained simulations, which are computationally cheap and
retain satisfactory solution accuracy. However, most existing
work focuses on data-driven approaches that rely on rich training datasets and
lack sufficient physical constraints. To this end, we propose a novel and
efficient spatiotemporal super-resolution framework via physics-informed
learning, inspired by the independence between temporal and spatial derivatives
in partial differential equations (PDEs). The general principle is to leverage
temporal interpolation for flow estimation, and then introduce
convolutional-recurrent neural networks to learn temporal refinement.
Furthermore, we employ stacked residual blocks with wide activation and
sub-pixel layers with pixel shuffle for spatial reconstruction, where feature
extraction is conducted in a low-resolution latent space. Moreover, we consider
hard imposition of boundary conditions in the network to improve reconstruction
accuracy. Results demonstrate the superior effectiveness and efficiency of the
proposed method compared with baseline algorithms through extensive numerical
experiments.
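For readers who want a concrete picture of the spatial-reconstruction pathway described above, the sketch below (PyTorch, assumed here only because the abstract does not fix a framework) stacks wide-activation residual blocks in a low-resolution latent space and upsamples with a sub-pixel (PixelShuffle) layer. Channel widths, the number of blocks, and the upscaling factor are illustrative assumptions; the temporal-interpolation branch, convolutional-recurrent refinement, physics-informed loss, and hard boundary-condition imposition are omitted. This is a minimal sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn


class WideActivationResBlock(nn.Module):
    """Residual block that widens the channels before the activation
    (WDSR-style 'wide activation'), with an identity skip connection."""

    def __init__(self, channels: int, expansion: int = 4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels * expansion, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels * expansion, channels, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)


class SpatialUpsampler(nn.Module):
    """Feature extraction in the low-resolution latent space, then a sub-pixel
    (PixelShuffle) layer for the actual spatial upscaling."""

    def __init__(self, in_channels: int = 2, channels: int = 32,
                 num_blocks: int = 8, scale: int = 4):
        super().__init__()
        self.head = nn.Conv2d(in_channels, channels, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(
            *[WideActivationResBlock(channels) for _ in range(num_blocks)]
        )
        # Conv produces scale**2 * in_channels maps; PixelShuffle rearranges
        # them into an upscaled field with the original number of channels.
        self.tail = nn.Sequential(
            nn.Conv2d(channels, in_channels * scale ** 2, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.tail(self.blocks(self.head(x)))


# Example: a batch of coarse 2-channel snapshots (e.g. u, v velocity) on a
# 32x32 grid is upscaled by a factor of 4 to 128x128.
coarse = torch.randn(4, 2, 32, 32)
fine = SpatialUpsampler()(coarse)   # -> torch.Size([4, 2, 128, 128])
```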
Related papers
- Optimal Transport-Based Displacement Interpolation with Data Augmentation for Reduced Order Modeling of Nonlinear Dynamical Systems [0.0]
We present a novel reduced-order model (ROM) that exploits optimal transport theory and displacement interpolation to enhance the representation of nonlinear dynamics in complex systems.
We show improved accuracy and efficiency in predicting complex system behaviors, indicating the potential of this approach for a wide range of applications in computational physics and engineering.
arXiv Detail & Related papers (2024-11-13T16:29:33Z) - Super-Resolution works for coastal simulations [6.263499279406057]
High-resolution simulations are necessary to advance understanding of many processes, specifically, to predict flooding from tsunamis and storm surges.
We propose a Deep Network for Super-resolution enhancement to efficiently learn high-resolution numerical solutions.
Our method shows superior super-resolution quality and fast computation compared to the state-of-the-art methods.
arXiv Detail & Related papers (2024-08-29T14:16:13Z) - Continuous Field Reconstruction from Sparse Observations with Implicit
Neural Networks [11.139052252214917]
This work presents a novel approach that learns a continuous representation of a physical field using implicit neural representations.
In experimental evaluations, the proposed model outperforms recent INR methods (a generic coordinate-MLP sketch of the INR idea is given after this list).
arXiv Detail & Related papers (2024-01-21T22:18:29Z) - Neural Network with Local Converging Input (NNLCI) for Supersonic Flow
Problems with Unstructured Grids [0.9152133607343995]
We develop a neural network with local converging input (NNLCI) for high-fidelity prediction using unstructured data.
As a validation case, the NNLCI method is applied to study inviscid supersonic flows in channels with bumps.
arXiv Detail & Related papers (2023-10-23T19:03:37Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks (a minimal slicing sketch of this staggered decomposition is given after this list).
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver [59.13397937903832]
We introduce a deep learning-based corrector called Neural Vector (NeurVec)
NeurVec can compensate for integration errors and enable larger time step sizes in simulations (a minimal learned-corrector sketch is given after this list).
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability.
arXiv Detail & Related papers (2022-08-07T09:02:18Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervision time points, and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Convolutional generative adversarial imputation networks for
spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Influence Estimation and Maximization via Neural Mean-Field Dynamics [60.91291234832546]
We propose a novel learning framework using neural mean-field (NMF) dynamics for inference and estimation problems.
Our framework can simultaneously learn the structure of the diffusion network and the evolution of node infection probabilities.
arXiv Detail & Related papers (2021-06-03T00:02:05Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
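The following is a generic coordinate-MLP sketch of the implicit-neural-representation idea summarized in the "Continuous Field Reconstruction from Sparse Observations with Implicit Neural Networks" entry above: a small network maps space-time coordinates to field values and is fitted to sparse observations, after which it can be queried at arbitrary resolution. The architecture, activation, and plain MSE fitting loop are assumptions for illustration, not that paper's model.

```python
import torch
import torch.nn as nn


class FieldINR(nn.Module):
    """Coordinate MLP: maps (x, y, t) to a scalar field value, so the learned
    field can be queried at any spatial/temporal resolution."""

    def __init__(self, in_dim: int = 3, hidden: int = 128, depth: int = 4):
        super().__init__()
        layers, width = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(width, hidden), nn.Tanh()]
            width = hidden
        layers.append(nn.Linear(width, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, coords: torch.Tensor) -> torch.Tensor:  # coords: (N, 3)
        return self.net(coords)                               # values: (N, 1)


# Fit to sparse (coordinate, value) observations with a plain MSE loss.
model = FieldINR()
coords_obs = torch.rand(256, 3)   # placeholder sparse sample locations
values_obs = torch.rand(256, 1)   # placeholder observed field values
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    optimizer.zero_grad()
    loss = torch.mean((model(coords_obs) - values_obs) ** 2)
    loss.backward()
    optimizer.step()
# The fitted model can now be evaluated on an arbitrarily dense coordinate grid.
```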
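The next sketch illustrates the staggered coarse-resolution decomposition summarized in the NeuralStagger entry: a fine grid is split into several interleaved coarser sub-grids that lighter models can process (potentially in parallel) and that are then re-interleaved. The stride-based slicing is a minimal illustration of the decomposition only, under an assumed 2D array layout; it omits the neural PDE solver itself.

```python
import torch


def stagger(field: torch.Tensor, s: int) -> list:
    """Split an (H, W) field into s*s interleaved coarse sub-fields of shape (H//s, W//s)."""
    return [field[i::s, j::s] for i in range(s) for j in range(s)]


def unstagger(subfields: list, s: int) -> torch.Tensor:
    """Re-interleave the coarse sub-fields back into the original fine grid."""
    h, w = subfields[0].shape
    out = torch.empty(h * s, w * s, dtype=subfields[0].dtype)
    k = 0
    for i in range(s):
        for j in range(s):
            out[i::s, j::s] = subfields[k]
            k += 1
    return out


# Round-trip check on an 8x8 field split into 2x2 = 4 coarse 4x4 sub-fields.
fine = torch.arange(64.0).reshape(8, 8)
coarse_parts = stagger(fine, 2)   # each part could be fed to a lighter model
assert torch.equal(unstagger(coarse_parts, 2), fine)
```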
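Finally, a minimal sketch of the learned-corrector idea summarized in the NeurVec entry: a small network adds a compensation term to a coarse explicit time step so that a larger step size can be tolerated. The toy dynamics, forward-Euler scheme, and network size are assumptions, not the paper's exact setup; in practice the corrector would be trained so that one large corrected step tracks a fine-step reference.

```python
import torch
import torch.nn as nn


def dynamics(u: torch.Tensor) -> torch.Tensor:
    """Toy right-hand side du/dt = f(u): a lightly damped oscillator."""
    x, v = u[..., 0], u[..., 1]
    return torch.stack([v, -x - 0.1 * v], dim=-1)


# Small corrector network; its output compensates the truncation error of a
# coarse explicit step (an assumed form for illustration).
corrector = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))


def corrected_euler_step(u: torch.Tensor, dt: float) -> torch.Tensor:
    # Forward-Euler update plus a learned compensation term.
    return u + dt * dynamics(u) + corrector(u)


# Usage (untrained corrector, shown only for shape checking): the corrector is
# normally trained so one large corrected step matches many small reference steps.
u0 = torch.tensor([1.0, 0.0])
u1 = corrected_euler_step(u0, dt=0.5)
```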