Frame-independent vector-cloud neural network for nonlocal constitutive
modelling on arbitrary grids
- URL: http://arxiv.org/abs/2103.06685v1
- Date: Thu, 11 Mar 2021 14:16:19 GMT
- Title: Frame-independent vector-cloud neural network for nonlocal constitutive
modelling on arbitrary grids
- Authors: Xu-Hui Zhou, Jiequn Han, Heng Xiao
- Abstract summary: Constitutive models are widely used for modelling complex systems in science and engineering.
We propose a frame-independent, nonlocal model based on a vector-cloud neural network that can be trained with data.
- Score: 4.168157981135698
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Constitutive models are widely used for modelling complex systems in science
and engineering, where first-principle-based, well-resolved simulations are
often prohibitively expensive. For example, in fluid dynamics, constitutive
models are required to describe nonlocal, unresolved physics such as turbulence
and laminar-turbulent transition. In particular, Reynolds stress models for
turbulence and intermittency transport equations for laminar-turbulent
transition both utilize convection-diffusion partial differential equations
(PDEs). However, traditional PDE-based constitutive models can lack robustness
and are often too rigid to accommodate diverse calibration data. We propose a
frame-independent, nonlocal constitutive model based on a vector-cloud neural
network that can be trained with data. The learned constitutive model can
predict the closure variable at a point based on the flow information in its
neighborhood. Such nonlocal information is represented by a group of points,
each having a feature vector attached to it, and thus the input is referred to
as vector cloud. The cloud is mapped to the closure variable through a
frame-independent neural network, which is invariant both to coordinate
translation and rotation and to the ordering of points in the cloud. As such,
the network takes any number of arbitrarily arranged grid points as input and
thus is suitable for unstructured meshes commonly used in fluid flow
simulations. The merits of the proposed network are demonstrated on scalar
transport PDEs on a family of parameterized periodic hill geometries. Numerical
results show that the vector-cloud neural network is a promising tool not only
as a nonlocal constitutive model but also as a general surrogate model for
PDEs on irregular domains.
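To make the invariance requirements in the abstract concrete, the following is a minimal sketch, not the authors' architecture, of a set function over a vector cloud that is invariant to coordinate translation and rotation and to the ordering (and number) of points: each point is encoded from its attached features together with its distance to the query point, the per-point encodings are mean-pooled, and a small regressor maps the pooled embedding to the closure variable. All function names and dimensions are illustrative assumptions, the per-point features are assumed to be frame-independent scalars (a simplification of the paper's setup), and only numpy is required.

```python
# Minimal sketch (not the paper's architecture): a frame-independent set function
# over a "vector cloud". Invariance comes from (i) using only distances to the
# query point, which are unchanged by rigid motions, and (ii) mean-pooling over
# points, which makes the output independent of their ordering and lets the
# network accept any number of points.
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights):
    """Tiny fully connected network; tanh on all layers except the last."""
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def init_mlp(sizes):
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

n_feat = 4                              # scalar flow features attached to each point (illustrative)
enc = init_mlp([n_feat + 1, 16, 16])    # per-point encoder; +1 input for distance to the query point
dec = init_mlp([16, 16, 1])             # maps the pooled embedding to the closure variable

def predict_closure(x_query, cloud_xyz, cloud_feat):
    """cloud_xyz: (n, 3) point coordinates; cloud_feat: (n, n_feat) per-point features."""
    dist = np.linalg.norm(cloud_xyz - x_query, axis=1, keepdims=True)  # rigid-motion invariant
    per_point = mlp(np.concatenate([cloud_feat, dist], axis=1), enc)   # (n, 16)
    pooled = per_point.mean(axis=0)                                    # permutation invariant
    return mlp(pooled, dec).item()

# Invariance check: rotate and translate the frame, and shuffle the point ordering.
n = 30
x_query = rng.standard_normal(3)
xyz = rng.standard_normal((n, 3))
feat = rng.standard_normal((n, n_feat))

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
shift = np.array([5.0, -2.0, 1.0])
perm = rng.permutation(n)

y1 = predict_closure(x_query, xyz, feat)
y2 = predict_closure(R @ x_query + shift, (xyz @ R.T + shift)[perm], feat[perm])
print(abs(y1 - y2) < 1e-10)  # True: same prediction in the transformed, reordered frame
```

The mean pooling is what lets this sketch, like the network described in the abstract, accept any number of arbitrarily arranged grid points, and the final check confirms that a rotated, translated, and reordered copy of the same cloud yields the same prediction.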
Related papers
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- A probabilistic, data-driven closure model for RANS simulations with aleatoric, model uncertainty [1.8416014644193066]
We propose a data-driven closure model for Reynolds-averaged Navier-Stokes (RANS) simulations that incorporates aleatoric model uncertainty.
A fully Bayesian formulation is proposed, combined with a sparsity-inducing prior in order to identify regions in the problem domain where the parametric closure is insufficient.
arXiv Detail & Related papers (2023-07-05T16:53:31Z)
- Equivariant geometric convolutions for emulation of dynamical systems [6.3003220645859175]
We use geometric convolutions to enforce coordinate freedom in surrogate machine learning models.
In numerical experiments emulating 2D compressible Navier-Stokes, we see better accuracy and improved stability.
The ease of enforcing coordinate freedom without making major changes to the model architecture provides an exciting recipe for any CNN-based method applied to an appropriate class of problems.
arXiv Detail & Related papers (2023-05-21T22:44:18Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard constitutive priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- The Contextual Lasso: Sparse Linear Models via Deep Neural Networks [5.607237982617641]
We develop a new statistical estimator that fits a sparse linear model to the explanatory features such that the sparsity pattern and coefficients vary as a function of the contextual features.
An extensive suite of experiments on real and synthetic data suggests that the learned models, which remain highly transparent, can be sparser than the regular lasso.
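(A minimal illustrative sketch of this context-dependent-coefficient idea is given after this list.)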
arXiv Detail & Related papers (2023-02-02T05:00:29Z)
- Neural net modeling of equilibria in NSTX-U [0.0]
We develop two neural networks relevant to equilibrium and shape control modeling.
Networks include Eqnet, a free-boundary equilibrium solver trained on the EFIT01 reconstruction algorithm, and Pertnet, which is trained on the Gspert code.
We report strong performance for both networks, indicating that these models could reliably be used within closed-loop simulations.
arXiv Detail & Related papers (2022-02-28T16:09:58Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newtonian mechanics systems with both fully and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Discrete Denoising Flows [87.44537620217673]
We introduce a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs).
In contrast with other discrete flow-based models, our model can be locally trained without introducing gradient bias.
We show that DDFs outperform Discrete Flows on modeling a toy example, binary MNIST and Cityscapes segmentation maps, measured in log-likelihood.
arXiv Detail & Related papers (2021-07-24T14:47:22Z)
- Physical invariance in neural networks for subgrid-scale scalar flux modeling [5.333802479607541]
We present a new strategy to model the subgrid-scale scalar flux in a three-dimensional turbulent incompressible flow using physics-informed neural networks (NNs).
We show that the proposed transformation-invariant NN model outperforms both purely data-driven ones and parametric state-of-the-art subgrid-scale models.
arXiv Detail & Related papers (2020-10-09T16:09:54Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
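Picking up the forward reference from the Contextual Lasso entry above: the sketch below is an illustration-only rendering of a sparse linear model whose coefficients, and therefore sparsity pattern, are produced by a small network from contextual features. Soft-thresholding is used as a stand-in sparsity mechanism and is not claimed to be that paper's construction; all names and dimensions are assumptions, and the weights are random rather than fitted.

```python
# Illustration only: a linear model in the explanatory features x whose coefficient
# vector beta(z) is produced by a small network from the contextual features z and
# then soft-thresholded, so both the coefficient values and the sparsity pattern
# depend on the context.
import numpy as np

rng = np.random.default_rng(1)

n_explanatory, n_context, hidden = 10, 3, 32
W1 = rng.standard_normal((n_context, hidden)) / np.sqrt(n_context)
W2 = rng.standard_normal((hidden, n_explanatory)) / np.sqrt(hidden)

def soft_threshold(v, lam):
    """Shrink toward zero; entries with |v| <= lam become exactly zero."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def predict(x, z, lam=0.5):
    """x: explanatory features, z: contextual features -> (prediction, coefficients)."""
    beta = soft_threshold(np.tanh(z @ W1) @ W2, lam)  # context-dependent sparse coefficients
    return float(x @ beta), beta

x = rng.standard_normal(n_explanatory)
_, beta1 = predict(x, z=np.array([1.0, 0.0, 0.0]))
_, beta2 = predict(x, z=np.array([0.0, 1.0, 0.0]))
print((beta1 != 0).astype(int))  # active coefficients under the first context
print((beta2 != 0).astype(int))  # active coefficients under the second context
```

The two printed masks show which explanatory features enter the linear model under each context; in the actual estimator these context-to-coefficient maps are fitted to data rather than drawn at random.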