FFEINR: Flow Feature-Enhanced Implicit Neural Representation for
Spatio-temporal Super-Resolution
- URL: http://arxiv.org/abs/2308.12508v2
- Date: Sun, 27 Aug 2023 02:07:26 GMT
- Title: FFEINR: Flow Feature-Enhanced Implicit Neural Representation for
Spatio-temporal Super-Resolution
- Authors: Chenyue Jiao, Chongke Bi and Lu Yang
- Abstract summary: This paper proposes a Flow Feature-Enhanced Implicit Neural Representation (FFEINR) for super-resolution of flow field data.
It can take full advantage of the implicit neural representation in terms of model structure and sampling resolution.
The training process of FFEINR is facilitated by introducing feature enhancements for the input layer.
- Score: 4.577685231084759
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Large-scale numerical simulations are capable of generating data up to
terabytes or even petabytes. As a promising method of data reduction,
super-resolution (SR) has been widely studied in the scientific visualization
community. However, most existing SR methods are based on deep convolutional
neural networks (CNNs) or generative adversarial networks (GANs), and the scale
factor must be fixed before the network is constructed. As a result, a single
training session only supports a fixed factor and has poor generalization
ability. To address these problems, this paper proposes a Flow Feature-Enhanced
Implicit Neural Representation (FFEINR) for spatio-temporal super-resolution of
flow field data. It can take full advantage of the implicit neural
representation in terms of model structure and sampling resolution. The neural
representation is based on a fully connected network with periodic activation
functions, which enables us to obtain lightweight models. The learned
continuous representation can decode the low-resolution flow field input data
to arbitrary spatial and temporal resolutions, allowing for flexible
upsampling. The training process of FFEINR is facilitated by introducing
feature enhancements for the input layer, which complements the contextual
information of the flow field. To demonstrate the effectiveness of the proposed
method, a series of experiments are conducted on different datasets by setting
different hyperparameters. The results show that FFEINR achieves significantly
better results than the trilinear interpolation method.
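The key ingredients named in the abstract, a fully connected network with periodic (sine) activations queried at continuous (x, y, t) coordinates plus a feature enhancement of the input layer, can be sketched roughly as follows. This is a minimal illustration under assumed sizes and names (FlowINR and feat_dim are invented here), not the authors' implementation, and the actual feature-enhancement scheme in the paper may differ.

```python
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Fully connected layer with a periodic (sine) activation, as in SIREN."""
    def __init__(self, in_features, out_features, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

class FlowINR(nn.Module):
    """Maps continuous (x, y, t) coordinates plus local context features
    to flow-field values, so any output resolution can be queried."""
    def __init__(self, coord_dim=3, feat_dim=16, hidden=128, depth=4, out_dim=2):
        super().__init__()
        layers = [SineLayer(coord_dim + feat_dim, hidden)]
        layers += [SineLayer(hidden, hidden) for _ in range(depth - 1)]
        self.body = nn.Sequential(*layers)
        self.head = nn.Linear(hidden, out_dim)

    def forward(self, coords, feats):
        # `feats` stands in for the input-layer feature enhancement:
        # extra per-sample context extracted from the low-resolution field.
        return self.head(self.body(torch.cat([coords, feats], dim=-1)))

# Decoding at an arbitrary resolution amounts to querying FlowINR on a denser
# coordinate grid; the trilinear baseline would instead call
# torch.nn.functional.interpolate(low_res, scale_factor=s, mode="trilinear").
```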
Related papers
- Flow reconstruction in time-varying geometries using graph neural networks [1.0485739694839669]
The model incorporates a feature propagation algorithm as a preprocessing step to handle extremely sparse inputs.
A binary indicator is introduced as a validity mask to distinguish between the original and propagated data points.
The model is trained on a unique data set of Direct Numerical Simulations (DNS) of a motored engine at a technically relevant operating condition.
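As a rough illustration of the feature-propagation preprocessing and the binary validity mask described above, the sketch below fills missing node features by averaging valid graph neighbors; the averaging rule and iteration count are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def propagate_features(features, mask, adjacency, num_iters=10):
    """Fill missing node features by repeatedly averaging known neighbors.

    features : (N, F) array, arbitrary values where mask == 0
    mask     : (N,) binary validity indicator (1 = original, 0 = missing)
    adjacency: (N, N) binary adjacency matrix of the mesh graph
    """
    feats = features.copy()
    valid = mask.astype(bool).copy()
    for _ in range(num_iters):
        # Sum of valid neighbor features and count of valid neighbors per node.
        neighbor_sum = adjacency @ (feats * valid[:, None])
        neighbor_cnt = adjacency @ valid.astype(feats.dtype)
        fillable = (neighbor_cnt > 0) & ~valid
        feats[fillable] = neighbor_sum[fillable] / neighbor_cnt[fillable][:, None]
        valid |= fillable
    # The binary indicator is kept as an extra channel so the network can
    # distinguish original measurements from propagated values.
    return np.concatenate([feats, mask[:, None]], axis=1)
```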
arXiv Detail & Related papers (2024-11-13T16:49:56Z)
- How to Train Neural Field Representations: A Comprehensive Study and Benchmark [29.725680049946032]
We propose a JAX-based library that leverages parallelization to enable fast optimization of datasets of neural fields.
We perform a study that investigates the effects of different hyperparameters on fitting NeFs for downstream tasks.
Based on the proposed library and our analysis, we propose Neural Field Arena, a benchmark consisting of neural field variants of popular vision datasets.
arXiv Detail & Related papers (2023-12-16T20:10:23Z)
- Accelerating Scalable Graph Neural Network Inference with Node-Adaptive Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
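The sketch below illustrates the general idea of node-adaptive propagation: each node receives a personalized number of propagation steps rather than a fixed global one. The stopping rule used here (update magnitude falling below a tolerance) is an illustrative assumption, not necessarily the criterion used in the paper.

```python
import numpy as np

def node_adaptive_propagation(adj_norm, features, max_steps=10, tol=1e-3):
    """Propagate features, freezing each node once its update becomes small.

    adj_norm : (N, N) normalized adjacency (e.g., D^-1/2 A D^-1/2)
    features : (N, F) input node features
    """
    h = features.copy()
    active = np.ones(len(h), dtype=bool)   # nodes still being propagated
    steps = np.zeros(len(h), dtype=int)    # personalized propagation depth
    for _ in range(max_steps):
        if not active.any():
            break
        h_new = adj_norm @ h
        delta = np.linalg.norm(h_new - h, axis=1)
        still_active = active & (delta > tol)
        h[still_active] = h_new[still_active]  # only active nodes keep updating
        steps[still_active] += 1
        active = still_active
    return h, steps
```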
arXiv Detail & Related papers (2023-10-17T05:03:00Z)
- Distributed Neural Representation for Reactive in situ Visualization [23.80657290203846]
Implicit neural representations (INRs) have emerged as a powerful tool for compressing large-scale volume data.
We develop a distributed neural representation and optimize it for in situ visualization.
Our technique eliminates data exchanges between processes, achieving state-of-the-art compression speed, quality and ratios.
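One way to read "eliminates data exchanges between processes" is that each rank independently fits a compact implicit representation to its local block of the volume. The sketch below illustrates that interpretation with a tiny per-block MLP; it is an assumed setup, not the paper's actual system.

```python
import torch
import torch.nn as nn

class BlockINR(nn.Module):
    """Small MLP mapping local (x, y, z) coordinates to a scalar value."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords):
        return self.net(coords)

def fit_local_block(coords, values, steps=500):
    """Each rank would call this on its own block only; no halo exchange."""
    model = BlockINR()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.mean((model(coords) - values) ** 2)
        loss.backward()
        opt.step()
    return model  # compact weights stand in for the raw block data
```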
arXiv Detail & Related papers (2023-03-28T03:55:47Z)
- Efficient Graph Neural Network Inference at Large Scale [54.89457550773165]
Graph neural networks (GNNs) have demonstrated excellent performance in a wide range of applications.
Existing scalable GNNs leverage linear propagation to preprocess the features and accelerate the training and inference procedure.
We propose a novel adaptive propagation order approach that generates the personalized propagation order for each node based on its topological information.
arXiv Detail & Related papers (2022-11-01T14:38:18Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
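The GAIN-style recipe behind Conv-GAIN can be sketched as follows: a generator imputes missing grid entries given the observed values and a binary mask, while a discriminator tries to predict which entries were originally observed, with convolutions replacing dense layers on the spatio-temporal grid. The layer sizes and the omission of GAIN's hint mechanism are simplifications assumed here.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Imputes missing grid values from observed data and a binary mask."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x_obs, mask):
        imputed = self.net(torch.cat([x_obs * mask, mask], dim=1))
        # Keep the observed entries, fill only the missing ones.
        return x_obs * mask + imputed * (1 - mask)

class Discriminator(nn.Module):
    """Predicts, per grid cell, whether the value was originally observed."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x_hat):
        return self.net(x_hat)
```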
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- ChiNet: Deep Recurrent Convolutional Learning for Multimodal Spacecraft Pose Estimation [3.964047152162558]
This paper presents an innovative deep learning pipeline which estimates the relative pose of a spacecraft by incorporating the temporal information from a rendezvous sequence.
It leverages the performance of long short-term memory (LSTM) units in modelling sequences of data for the processing of features extracted by a convolutional neural network (CNN) backbone.
Three distinct training strategies, which follow a coarse-to-fine funnelled approach, are combined to facilitate feature learning and improve end-to-end pose estimation by regression.
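A minimal sketch of the CNN-backbone-plus-LSTM arrangement described above: per-frame features from a convolutional encoder are passed through an LSTM, and the final state is regressed to a pose. The toy backbone, the feature sizes, and the 7-dimensional pose output (translation plus quaternion) are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class SequencePoseNet(nn.Module):
    def __init__(self, feat_dim=256, hidden=128, pose_dim=7):
        super().__init__()
        # Per-frame convolutional feature extractor (toy backbone).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # LSTM models the temporal evolution across the rendezvous sequence.
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)  # e.g., translation + quaternion

    def forward(self, frames):                   # frames: (B, T, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.backbone(frames.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])             # pose estimate from last step
```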
arXiv Detail & Related papers (2021-08-23T16:48:58Z)
- Motor Imagery Classification based on CNN-GRU Network with Spatio-Temporal Feature Representation [22.488536453952964]
Recently various deep neural networks have been applied to electroencephalogram (EEG) signal.
EEG is a brain signal that can be acquired in a non-invasive way and has a high temporal resolution.
As the EEG signal has a high dimension of classification feature space, appropriate feature extraction methods are needed to improve performance.
arXiv Detail & Related papers (2021-07-15T01:05:38Z)
- Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
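The core idea above, constraining a layer's weight tensor to a Canonical/Polyadic (CP) decomposition so that a multilinear input is contracted mode by mode instead of being vectorized, can be sketched as below for a third-order input; the dimensions, rank, and activation are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RankRLayer(nn.Module):
    """Hidden layer whose weight tensor is constrained to CP rank R.

    For an input X of shape (I1, I2, I3), each hidden unit h computes
    sum_r <X, a_r x b_r x c_r>, i.e. mode-wise contractions with the
    factor vectors rather than a dense product with a flattened X.
    """
    def __init__(self, dims=(8, 8, 8), rank=4, hidden=16):
        super().__init__()
        i1, i2, i3 = dims
        self.a = nn.Parameter(torch.randn(hidden, rank, i1) * 0.1)
        self.b = nn.Parameter(torch.randn(hidden, rank, i2) * 0.1)
        self.c = nn.Parameter(torch.randn(hidden, rank, i3) * 0.1)
        self.bias = nn.Parameter(torch.zeros(hidden))

    def forward(self, x):                     # x: (batch, I1, I2, I3)
        # Contract each mode with its factor vectors, then sum over the rank.
        t = torch.einsum("nijk,hri->nhrjk", x, self.a)
        t = torch.einsum("nhrjk,hrj->nhrk", t, self.b)
        t = torch.einsum("nhrk,hrk->nh", t, self.c)
        return torch.tanh(t + self.bias)
```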
arXiv Detail & Related papers (2021-04-11T16:37:32Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
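For intuition, the classical two-layer mean-field picture, which this framework generalizes to deep architectures, replaces the average over hidden units by an integral against a probability measure over parameters. The standard notation below is only meant to situate the idea, not to reproduce the paper's formulation.

```latex
% Finite-width two-layer network with m hidden units, and its mean-field
% (m -> infinity) limit, where the empirical distribution of the parameters
% (a_i, w_i) is replaced by a probability measure rho:
\[
  f_m(x) = \frac{1}{m}\sum_{i=1}^{m} a_i\,\sigma(w_i^{\top}x),
  \qquad
  f_\rho(x) = \int a\,\sigma(w^{\top}x)\,\mathrm{d}\rho(a,w).
\]
```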
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
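The mechanism the title refers to, making the non-differentiable spike usable with backpropagation by substituting a surrogate gradient, can be sketched as below. The rectified-linear-shaped surrogate window is chosen to echo the title; the exact postsynaptic potential function proposed in the paper is not reproduced here.

```python
import torch

class SpikeWithSurrogate(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectified-linear-style surrogate
    gradient in the backward pass so errors can flow through spiking layers."""

    @staticmethod
    def forward(ctx, membrane_potential, threshold=1.0):
        ctx.save_for_backward(membrane_potential)
        ctx.threshold = threshold
        return (membrane_potential >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Triangular window around the firing threshold, clipped at zero.
        surrogate = torch.clamp(1.0 - torch.abs(v - ctx.threshold), min=0.0)
        return grad_output * surrogate, None

spike = SpikeWithSurrogate.apply  # usage: spikes = spike(membrane_voltage)
```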
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.