Exploring Physical Latent Spaces for High-Resolution Flow Restoration
- URL: http://arxiv.org/abs/2211.11298v2
- Date: Tue, 3 Oct 2023 08:55:59 GMT
- Title: Exploring Physical Latent Spaces for High-Resolution Flow Restoration
- Authors: Chloe Paliard, Nils Thuerey, Kiwon Um
- Abstract summary: We explore training deep neural network models in conjunction with physics simulations via partial differential equations (PDEs).
In contrast to previous work, this paper treats the degrees of freedom of the simulated space purely as tools to be used by the neural network.
- Score: 22.924868896246334
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We explore training deep neural network models in conjunction with physics
simulations via partial differential equations (PDEs), using the simulated
degrees of freedom as latent space for a neural network. In contrast to
previous work, this paper treats the degrees of freedom of the simulated space
purely as tools to be used by the neural network. We demonstrate this concept
for learning reduced representations, as it is extremely challenging to
faithfully preserve correct solutions over long time-spans with traditional
reduced representations, particularly for solutions with large amounts of
small-scale features. This work focuses on the use of such a physical, reduced latent
space for the restoration of fine simulations, by training models that can
modify the content of the reduced physical states as much as needed to best
satisfy the learning objective. This autonomy allows the neural networks to
discover alternate dynamics that significantly improve the performance in the
given tasks. We demonstrate this concept for various fluid flows ranging from
different turbulence scenarios to rising smoke plumes.
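The training setup described above — a neural network that freely modifies a reduced (coarse) physical state which a PDE solver then advances in time — can be sketched as follows. This is a minimal illustration, not the paper's actual architecture: the 1D diffusion step stands in for the differentiable fluid solver, and the `correction` callable stands in for the trained network; all names here are assumptions for illustration.

```python
import numpy as np

def diffusion_step(u, nu=0.1):
    """One explicit step of 1D periodic diffusion (stand-in for the PDE solver)."""
    return u + nu * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

def downsample(u_fine, factor=4):
    """Reduce the fine state to the coarse 'physical latent space'."""
    return u_fine.reshape(-1, factor).mean(axis=1)

def rollout(u_fine, correction, steps=10, factor=4):
    """Advance the coarse state in time, letting a correction term modify the
    simulated degrees of freedom at every step (the network's role)."""
    z = downsample(u_fine, factor)
    for _ in range(steps):
        z = diffusion_step(z) + correction(z)
    return z

# Toy fine-resolution initial state: a smoothed spike on 64 cells.
u0 = np.exp(-np.linspace(-3, 3, 64) ** 2)
# Placeholder "network": a zero correction, so the rollout is pure physics.
z_final = rollout(u0, correction=lambda z: np.zeros_like(z))
print(z_final.shape)
```

In the paper's setting, the correction would be a trained model optimized end-to-end through the differentiable solver, and a decoder would restore the high-resolution flow from the coarse states; this sketch only shows the solver-in-the-loop structure.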
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Deep multitask neural networks for solving some stochastic optimal control problems [0.0]
In this paper, we consider a class of optimal control problems and introduce an effective solution employing neural networks.
To train our multitask neural network, we introduce a novel scheme that dynamically balances the learning across tasks.
Through numerical experiments on real-world derivatives pricing problems, we show that our method outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2024-01-23T17:20:48Z)
- Neural Stress Fields for Reduced-order Elastoplasticity and Fracture [43.538728312264524]
We propose a hybrid neural network and physics framework for reduced-order modeling of elastoplasticity and fracture.
The key innovation is training a low-dimensional manifold for the Kirchhoff stress field via an implicit neural representation.
We demonstrate dimension reduction by up to 100,000X and time savings by up to 10X.
arXiv Detail & Related papers (2023-10-26T21:37:32Z)
- NeuralClothSim: Neural Deformation Fields Meet the Thin Shell Theory [70.10550467873499]
We propose NeuralClothSim, a new quasistatic cloth simulator using thin shells.
Our memory-efficient solver operates on a new continuous coordinate-based surface representation called neural deformation fields.
arXiv Detail & Related papers (2023-08-24T17:59:54Z)
- Backpropagation-free Training of Deep Physical Neural Networks [0.0]
We propose a simple deep neural network architecture augmented by a biologically plausible learning algorithm, referred to as "model-free forward-forward training".
We show that our method outperforms state-of-the-art hardware-aware training methods by improving training speed, decreasing digital computations, and reducing power consumption in physical systems.
arXiv Detail & Related papers (2023-04-20T14:02:49Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
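The decomposition idea in the summary above — splitting a fine field into several coarse-resolution pieces that can be handled independently — can be illustrated with a simple staggered sampling scheme. This is a hedged sketch of one plausible spatial decomposition, not NeuralStagger's actual method; the function names and the stride-based splitting are assumptions for illustration.

```python
import numpy as np

def stagger_decompose(u, s=2):
    """Split a 2D field into s*s staggered coarse sub-fields, each holding
    every s-th sample with a different offset."""
    return [u[i::s, j::s] for i in range(s) for j in range(s)]

def stagger_recompose(subs, s=2):
    """Interleave the coarse sub-fields back into the original fine field."""
    h, w = subs[0].shape
    u = np.empty((h * s, w * s))
    for k, sub in enumerate(subs):
        u[k // s::s, k % s::s] = sub
    return u

# A 4x4 toy field splits into four 2x2 staggered sub-fields and recomposes losslessly.
u = np.arange(16.0).reshape(4, 4)
subs = stagger_decompose(u)
assert np.allclose(stagger_recompose(subs), u)
```

Because each coarse sub-field covers the whole domain at lower resolution, a solver or network can process them in parallel, which is the source of the acceleration the summary describes.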
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- A deep learning theory for neural networks grounded in physics [2.132096006921048]
We argue that building large, fast and efficient neural networks on neuromorphic architectures requires rethinking the algorithms to implement and train them.
Our framework applies to a very broad class of models, namely systems whose state or dynamics are described by variational equations.
arXiv Detail & Related papers (2021-03-18T02:12:48Z)
- Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.