Implicit Neural Spatial Representations for Time-dependent PDEs
- URL: http://arxiv.org/abs/2210.00124v2
- Date: Wed, 31 May 2023 02:26:55 GMT
- Title: Implicit Neural Spatial Representations for Time-dependent PDEs
- Authors: Honglin Chen, Rundi Wu, Eitan Grinspun, Changxi Zheng, Peter Yichen Chen
- Abstract summary: Implicit Neural Spatial Representation (INSR) has emerged as an effective representation of spatially-dependent vector fields.
This work explores solving time-dependent PDEs with INSR.
- Score: 29.404161110513616
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Implicit Neural Spatial Representation (INSR) has emerged as an effective
representation of spatially-dependent vector fields. This work explores solving
time-dependent PDEs with INSR. Classical PDE solvers introduce both temporal
and spatial discretizations. Common spatial discretizations include meshes and
meshless point clouds, where each degree-of-freedom corresponds to a location
in space. While these explicit spatial correspondences are intuitive to model
and understand, these representations are not necessarily optimal for accuracy,
memory usage, or adaptivity. Keeping the classical temporal discretization
unchanged (e.g., explicit/implicit Euler), we explore INSR as an alternative
spatial discretization, where spatial information is implicitly stored in the
neural network weights. The network weights then evolve over time via time
integration. Our approach does not require any training data generated by
existing solvers because our approach is the solver itself. We validate our
approach on various PDEs with examples involving large elastic deformations,
turbulent fluids, and multi-scale phenomena. While slower to compute than
traditional representations, our approach exhibits higher accuracy and lower
memory consumption. Whereas classical solvers can dynamically adapt their
spatial representation only by resorting to complex remeshing algorithms, our
INSR approach is intrinsically adaptive. By tapping into the rich literature of
classic time integrators, e.g., operator-splitting schemes, our method enables
challenging simulations in contact mechanics and turbulent flows where previous
neural-physics approaches struggle. Videos and codes are available on the
project page: http://www.cs.columbia.edu/cg/INSR-PDE/
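To make the recipe in the abstract concrete, the sketch below advances an MLP-stored field by one explicit Euler step for the 1D advection equation u_t = -c u_x: the new weights are fit so the network reproduces the time-stepped field at sampled collocation points. The architecture, optimizer settings, and the choice of advection itself are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of INSR time stepping, assuming the general recipe from the
# abstract: classical explicit Euler in time, an MLP as the spatial
# representation. All hyperparameters are illustrative.
import copy
import torch

def make_field(width=64):
    # MLP u_theta(x): spatial information lives in the weights.
    return torch.nn.Sequential(
        torch.nn.Linear(1, width), torch.nn.Tanh(),
        torch.nn.Linear(width, width), torch.nn.Tanh(),
        torch.nn.Linear(width, 1))

def field_and_grad(field, x):
    # u(x) and du/dx via autograd at collocation points x.
    x = x.requires_grad_(True)
    u = field(x)
    du_dx = torch.autograd.grad(u.sum(), x)[0]
    return u, du_dx

def explicit_euler_step(field, dt, c=1.0, iters=500, n_pts=256):
    # Fit new weights to u_t(x) - dt * c * du/dx, i.e. one Euler step
    # of the (illustrative) advection equation u_t = -c u_x.
    prev = copy.deepcopy(field)                   # frozen state at time t
    opt = torch.optim.Adam(field.parameters(), lr=1e-3)
    for _ in range(iters):
        x = 2 * torch.rand(n_pts, 1) - 1          # points in [-1, 1]
        u_prev, du_dx = field_and_grad(prev, x)
        target = (u_prev - dt * c * du_dx).detach()
        loss = ((field(x) - target) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return field

field = make_field()
# ... fit `field` to the initial condition first, then roll out in time:
for _ in range(10):
    field = explicit_euler_step(field, dt=1e-2)
```

Each step is a small optimization over the weights, which matches the trade-off stated in the abstract: slower than a classical grid update, but mesh-free, intrinsically adaptive, and requiring no training data.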
Related papers
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN).
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z)
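The core pattern in the PhyMPGN entry above is a learned right-hand side inside a classical integrator. A minimal sketch, assuming a toy mean-aggregation GNN and a Heun (RK2) step; these module details are assumptions, not the paper's:

```python
# Hypothetical sketch: a GNN approximates du/dt on a mesh graph, and a
# classical Runge-Kutta scheme does the temporal marching.
import torch

class RHSNet(torch.nn.Module):
    def __init__(self, dim=1, hidden=32):
        super().__init__()
        self.msg = torch.nn.Sequential(
            torch.nn.Linear(2 * dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, dim))

    def forward(self, u, adj):
        # adj: (N, N) normalized adjacency; u: (N, dim) node states.
        neigh = adj @ u                       # aggregate neighbor states
        return self.msg(torch.cat([u, neigh], dim=-1))

def rk2_step(f, u, adj, dt):
    """One Heun (RK2) step with a learned right-hand side f."""
    k1 = f(u, adj)
    k2 = f(u + dt * k1, adj)
    return u + 0.5 * dt * (k1 + k2)

# Training would match rollouts of rk2_step against trajectory data.
```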
- Smooth and Sparse Latent Dynamics in Operator Learning with Jerk Regularization [1.621267003497711]
This paper introduces a continuous operator learning framework that incorporates jerk regularization into the learning of the compressed latent space.
The framework allows for inference at any desired spatial or temporal resolution.
The effectiveness of this framework is demonstrated through a two-dimensional unsteady flow problem governed by the Navier-Stokes equations.
arXiv Detail & Related papers (2024-02-23T22:38:45Z)
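A guess at the core mechanism named in the title above: penalize the third temporal derivative (jerk) of the latent trajectory, approximated by finite differences, so the learned latent dynamics stay smooth. The names `z`, `dt`, and `lam` are illustrative:

```python
import torch

def jerk_penalty(z, dt):
    """z: (T, d) latent states at uniform times; returns a scalar penalty
    approximating the integral of ||d^3 z / dt^3||^2 via the third
    finite difference."""
    jerk = (z[3:] - 3 * z[2:-1] + 3 * z[1:-2] - z[:-3]) / dt**3
    return (jerk ** 2).mean()

# total_loss = reconstruction_loss + lam * jerk_penalty(z, dt)
```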
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Continuous PDE Dynamics Forecasting with Implicit Neural Representations [24.460010868042758]
We introduce a new data-driven approach that models PDE flows as continuous-time dynamics of spatially continuous functions.
This is achieved by embedding spatial observations independently of their discretization via Implicit Neural Representations.
It extrapolates at arbitrary spatial and temporal locations; it can be trained on sparse grids or irregular data; and it generalizes to new grids or resolutions at test time.
arXiv Detail & Related papers (2022-09-29T15:17:50Z)
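The entry above pairs naturally with the INSR theme: a coordinate network decodes a latent code into a continuous spatial function, while a second network evolves the code in time. A minimal sketch under that reading; the modulation scheme and Euler unrolling are assumptions, not the paper's exact model:

```python
import torch

class INRDecoder(torch.nn.Module):
    """Coordinate network conditioned on a latent code z."""
    def __init__(self, z_dim=16, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(1 + z_dim, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1))

    def forward(self, x, z):
        # x: (N, 1) query coordinates; z: (z_dim,) shared latent code.
        zz = z.expand(x.shape[0], -1)
        return self.net(torch.cat([x, zz], dim=-1))

dynamics = torch.nn.Sequential(        # dz/dt = g(z)
    torch.nn.Linear(16, 64), torch.nn.Tanh(), torch.nn.Linear(64, 16))

def forecast(decoder, z0, xs, n_steps, dt):
    """Roll the latent forward and decode at arbitrary coordinates xs."""
    z, frames = z0, []
    for _ in range(n_steps):
        z = z + dt * dynamics(z)       # Euler step in latent space
        frames.append(decoder(xs, z))
    return frames
```

Because the decoder takes raw coordinates, queries need not lie on any training grid, which is what enables the arbitrary-location extrapolation claimed above.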
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the time of supervision and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
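A minimal sketch of the hyper-network ingredient described above: a small network maps the query time t to the weights of a solution map applied to the initial state, so any intermediate time can be queried. The paper combines this with a Fourier Neural Operator; a plain linear map stands in here for brevity, and all sizes are illustrative:

```python
import torch

N, H = 64, 32                      # state size, hypernet width

hyper = torch.nn.Sequential(       # t -> flattened (W, b) of the target map
    torch.nn.Linear(1, H), torch.nn.Tanh(),
    torch.nn.Linear(H, N * N + N))

def solve_at_time(u0, t):
    """u0: (N,) initial state; t: scalar tensor; returns predicted u(t)."""
    theta = hyper(t.view(1, 1)).squeeze(0)
    W = theta[: N * N].view(N, N)
    b = theta[N * N:]
    return u0 @ W.T + b            # time-conditioned linear solution map
```

Because t is a continuous input, the model can be evaluated between supervision times, matching the interpolation claim above.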
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
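The LE-PDE entry above amounts to: encode once, evolve in a small latent space, decode on demand. A minimal sketch with illustrative sizes (the 1024 -> 8 latent mirrors the quoted 128x dimension reduction):

```python
import torch

enc = torch.nn.Sequential(torch.nn.Linear(1024, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 8))      # 128x smaller latent
dyn = torch.nn.Sequential(torch.nn.Linear(8, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 8))
dec = torch.nn.Sequential(torch.nn.Linear(8, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 1024))

def rollout(u0, n_steps):
    z = enc(u0)
    for _ in range(n_steps):
        z = dyn(z)                 # cheap updates in the tiny latent space
    return dec(z)

# Long-term stability would be encouraged by matching multi-step latent
# rollouts against encoded ground truth, per the objectives above.
```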
- Unraveled Multilevel Transformation Networks for Predicting Sparsely-Observed Spatiotemporal Dynamics [12.627823168264209]
We propose a model that learns to predict unknown dynamics using data from sparsely-distributed data sites.
We demonstrate the advantage of our approach using both synthetic and real-world climate data.
arXiv Detail & Related papers (2022-03-16T14:44:05Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
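The claim above that message passing representationally contains finite differences can be seen directly: a linear message over state differences reproduces a stencil, while an MLP message gives the learned solver. A toy sketch (a dense Python loop for clarity, not an efficient implementation):

```python
import torch

def mp_layer(u, edges, msg_fn):
    """u: (N, d) node states; edges: list of (i, j) pairs; msg_fn builds a
    message from (u_i, u_j). Returns the aggregated message per node."""
    out = torch.zeros_like(u)
    for i, j in edges:
        out[i] += msg_fn(u[i], u[j])
    return out

# With bidirectional edges (i, i-1), (i, i+1) on a 1D chain, the linear
# message below aggregates to the classical Laplacian stencil
# (u_{i-1} - 2 u_i + u_{i+1}) / h^2 ...
h = 0.1
laplacian = lambda ui, uj: (uj - ui) / h**2
# ... while a learned msg_fn (an MLP over [u_i, u_j, x_i - x_j]) is the
# backprop-optimized replacement the entry describes.
```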
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (NNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
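A minimal sketch of the label-free, physics-informed training signal suggested by the PhyCRNet entry above: fixed finite-difference convolution kernels supply spatial derivatives, and the PDE residual itself is the loss. The 1D Burgers equation u_t + u u_x = nu * u_xx and all constants here are illustrative assumptions, and the paper's convolutional-recurrent predictor is abbreviated to `model`:

```python
import torch
import torch.nn.functional as F

dx, dt, nu = 0.1, 0.01, 0.01
k_dx  = torch.tensor([[[-0.5, 0.0, 0.5]]]) / dx        # central difference
k_dxx = torch.tensor([[[1.0, -2.0, 1.0]]]) / dx**2     # second difference

def residual_loss(u_prev, u_next):
    """u_prev, u_next: (B, 1, N) fields at consecutive time steps."""
    u_t  = (u_next - u_prev) / dt
    u_x  = F.conv1d(u_next, k_dx,  padding=1)
    u_xx = F.conv1d(u_next, k_dxx, padding=1)
    r = u_t + u_next * u_x - nu * u_xx                 # Burgers residual
    return (r ** 2).mean()

# loss = residual_loss(u, model(u))   # no labeled data required
```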
- Adaptive Latent Space Tuning for Non-Stationary Distributions [62.997667081978825]
We present a method for adaptive tuning of the low-dimensional latent space of deep encoder-decoder style CNNs.
We demonstrate our approach for predicting the properties of a time-varying charged particle beam in a particle accelerator.
arXiv Detail & Related papers (2021-05-08T03:50:45Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
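A minimal sketch of the time-varying-weights idea above: the dynamics' weight matrix is itself a function of t, making the ODE non-autonomous. Parameterizing W(t) with a small MLP is one plausible choice, not necessarily the paper's:

```python
import torch

d, H = 4, 16
weight_net = torch.nn.Sequential(  # t -> flattened W(t) of shape (d, d)
    torch.nn.Linear(1, H), torch.nn.Tanh(), torch.nn.Linear(H, d * d))

def f(t, x):
    W = weight_net(t.view(1, 1)).view(d, d)
    return torch.tanh(x @ W.T)     # dx/dt = tanh(W(t) x)

def odeint_euler(x0, t0, t1, n=100):
    """Fixed-step Euler integration of the non-autonomous dynamics."""
    x, t, h = x0, torch.tensor(t0), (t1 - t0) / n
    for _ in range(n):
        x = x + h * f(t, x)
        t = t + h
    return x
```

Letting the vector field depend on t in this way is what adds representational capacity over an autonomous Neural ODE with fixed weights.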