Accelerating Particle and Fluid Simulations with Differentiable Graph
Networks for Solving Forward and Inverse Problems
- URL: http://arxiv.org/abs/2309.13348v1
- Date: Sat, 23 Sep 2023 11:52:43 GMT
- Title: Accelerating Particle and Fluid Simulations with Differentiable Graph
Networks for Solving Forward and Inverse Problems
- Authors: Krishna Kumar and Yongjin Choi
- Abstract summary: We leverage physics-embedded differentiable graph network simulators to solve particulate and fluid simulations.
GNS represents the domain as a graph with particles as nodes and learned interactions as edges.
GNS achieves over 165x speedup for granular flow prediction compared to parallel CPU numerical simulations.
- Score: 2.153852088624324
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We leverage physics-embedded differentiable graph network simulators (GNS) to
accelerate particulate and fluid simulations to solve forward and inverse
problems. GNS represents the domain as a graph with particles as nodes and
learned interactions as edges. Compared to modeling global dynamics, GNS
enables learning local interaction laws through edge messages, improving its
generalization to new environments. GNS achieves over 165x speedup for granular
flow prediction compared to parallel CPU numerical simulations. We propose a
novel hybrid GNS/Material Point Method (MPM) approach to accelerate forward
simulations: interleaving MPM steps within GNS rollouts enforces conservation
laws and reduces the error of the pure surrogate model, achieving a 24x speedup
compared to pure numerical simulations. The differentiable GNS enables solving
inverse problems through automatic differentiation, identifying material
parameters that result in target runout distances. We demonstrate the ability
of GNS to solve inverse problems by iteratively updating the friction angle (a
material property) using the gradient of a loss function between the final and
target runouts, thereby identifying the friction angle that best
matches the observed runout. The physics-embedded and differentiable simulators
open an exciting new paradigm for AI-accelerated design, control, and
optimization.
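
The inverse analysis described in the abstract is, in effect, gradient descent on a material parameter through the differentiable simulator. Below is a minimal sketch of that loop in PyTorch; `surrogate_runout` is a hypothetical toy stand-in for the trained GNS rollout (the actual workflow backpropagates through the full rollout), and the target value, initial guess, and learning rate are illustrative assumptions only.

```python
import torch

def surrogate_runout(friction_angle_deg: torch.Tensor) -> torch.Tensor:
    """Toy differentiable map from friction angle to runout distance, a stand-in
    for differentiating through a trained GNS rollout; steeper friction angles
    give shorter runouts."""
    return 10.0 / torch.tan(torch.deg2rad(friction_angle_deg))

target_runout = torch.tensor(18.0)            # observed (target) runout distance
phi = torch.tensor(35.0, requires_grad=True)  # initial friction-angle guess, degrees
optimizer = torch.optim.Adam([phi], lr=0.5)

for step in range(200):
    optimizer.zero_grad()
    loss = (surrogate_runout(phi) - target_runout) ** 2  # final vs. target runout
    loss.backward()   # reverse-mode automatic differentiation through the surrogate
    optimizer.step()  # gradient-based update of the material parameter

print(f"identified friction angle: {phi.item():.2f} degrees")
```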
Related papers
- Inverse analysis of granular flows using differentiable graph neural network simulator [1.8231854497751137]
Inverse problems in granular flows, such as landslides and debris flows, involve estimating material parameters or boundary conditions.
Traditional high-fidelity simulators for these inverse problems are computationally demanding.
We propose a novel differentiable graph neural network simulator (GNS) by combining reverse mode automatic differentiation of graph neural networks with gradient-based optimization.
arXiv Detail & Related papers (2024-01-17T22:21:07Z) - Three-dimensional granular flow simulation using graph neural
network-based learned simulator [2.153852088624324]
We use a graph neural network (GNN) to develop a simulator for granular flows.
The simulator reproduces the overall behaviors of column collapses with various aspect ratios.
GNS outperforms high-fidelity numerical simulators in speed by a factor of 300.
arXiv Detail & Related papers (2023-11-13T15:54:09Z) - Graph Neural Network-based surrogate model for granular flows [2.153852088624324]
Granular flow dynamics is crucial for assessing various geotechnical risks, including landslides and debris flows.
Traditional continuum and discrete numerical methods are limited by their computational cost in simulating large-scale systems.
We develop a graph neural network-based simulator (GNS) that learns the local interaction laws and predicts the next state of a granular flow from the current state using explicit integration.
arXiv Detail & Related papers (2023-05-09T07:28:12Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective in solving forward and inverse differential equation problems.
However, PINNs can suffer from training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - GNS: A generalizable Graph Neural Network-based simulator for
particulate and fluid modeling [2.132096006921048]
We develop a PyTorch-based Graph Network Simulator (GNS) that learns physics and predicts the flow behavior of particulate and fluid systems.
GNS discretizes the domain with nodes representing a collection of material points and the links connecting the nodes representing the local interaction between particles or clusters of particles.
arXiv Detail & Related papers (2022-11-18T13:28:03Z) - Transformer with Implicit Edges for Particle-based Physics Simulation [135.77656965678196]
Transformer with Implicit Edges (TIE) captures the rich semantics of particle interactions in an edge-free manner.
We evaluate our model on diverse domains of varying complexity and materials.
arXiv Detail & Related papers (2022-07-22T03:45:29Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Rotation Invariant Graph Neural Networks using Spin Convolutions [28.4962005849904]
Machine learning approaches have the potential to approximate Density Functional Theory (DFT) in a computationally efficient manner.
We introduce a novel approach to modeling angular information between sets of neighboring atoms in a graph neural network.
Results are demonstrated on the large-scale Open Catalyst 2020 dataset.
arXiv Detail & Related papers (2021-06-17T14:59:34Z) - Fast Gravitational Approach for Rigid Point Set Registration with
Ordinary Differential Equations [79.71184760864507]
This article introduces a new physics-based method for rigid point set alignment called Fast Gravitational Approach (FGA).
In FGA, the source and target point sets are interpreted as rigid particle swarms with masses interacting in a globally multiply-linked manner while moving in a simulated gravitational force field.
We show that the new method class has characteristics not found in previous alignment methods.
arXiv Detail & Related papers (2020-09-28T15:05:39Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message passing; a minimal sketch of this node-and-edge structure follows this list.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
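
Several of the papers above (the GNS surrogates and "Learning to Simulate Complex Physics with Graph Networks") share the same structure: particles as graph nodes, local interactions as edges, and learned edge messages that update particle states. The sketch below shows one such message-passing step in PyTorch; the module name, layer sizes, and connectivity are illustrative assumptions, not the published architectures.

```python
import torch
import torch.nn as nn

class InteractionStep(nn.Module):
    """One GNS-style message-passing step: particles are nodes, local
    interactions are directed edges, and learned edge messages update
    each particle's latent state."""

    def __init__(self, node_dim: int = 16, message_dim: int = 16):
        super().__init__()
        # edge MLP: learns the local interaction law from the two endpoint states
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim, 64), nn.ReLU(), nn.Linear(64, message_dim))
        # node MLP: updates each particle from its aggregated incoming messages
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + message_dim, 64), nn.ReLU(), nn.Linear(64, node_dim))

    def forward(self, nodes, senders, receivers):
        # one message per edge, computed from sender and receiver particle states
        messages = self.edge_mlp(torch.cat([nodes[senders], nodes[receivers]], dim=-1))
        # sum incoming messages at each receiving particle
        agg = torch.zeros(nodes.size(0), messages.size(-1), device=nodes.device)
        agg.index_add_(0, receivers, messages)
        return self.node_mlp(torch.cat([nodes, agg], dim=-1))

# usage: 5 particles with three directed edges (0->1, 1->2, 2->0)
nodes = torch.randn(5, 16)
senders = torch.tensor([0, 1, 2])
receivers = torch.tensor([1, 2, 0])
nodes = InteractionStep()(nodes, senders, receivers)
print(nodes.shape)  # torch.Size([5, 16])
```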
This list is automatically generated from the titles and abstracts of the papers on this site.