A shallow physics-informed neural network for solving partial
differential equations on surfaces
- URL: http://arxiv.org/abs/2203.01581v1
- Date: Thu, 3 Mar 2022 09:18:21 GMT
- Authors: Wei-Fan Hu, Yi-Jun Shih, Te-Sheng Lin, Ming-Chih Lai
- Abstract summary: We introduce a mesh-free physics-informed neural network for solving partial differential equations on surfaces.
With the aid of a level set function, the surface geometrical quantities, such as the normal and mean curvature of the surface, can be computed directly and used in our surface differential expressions.
With just a few hundred trainable parameters, our network model is able to achieve high predictive accuracy.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we introduce a mesh-free physics-informed neural network for
solving partial differential equations on surfaces. Based on the idea of
embedding techniques, we write the underlying surface differential equations
using conventional Cartesian differential operators. With the aid of a level
set function, the surface geometrical quantities, such as the normal and mean
curvature of the surface, can be computed directly and used in our surface
differential expressions. Thus, instead of imposing the normal extension
constraints used in the literature, we take the whole Cartesian differential
expressions into account in our loss function. Meanwhile, we adopt a completely
shallow (one hidden layer) network so the present model is easy to implement
and train. We perform a series of numerical experiments on both stationary and
time-dependent partial differential equations on complicated surface
geometries. The result shows that, with just a few hundred trainable
parameters, our network model is able to achieve high predictive accuracy.
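The abstract's level-set construction can be illustrated with a short sketch. This is not the authors' code; the choice of the unit sphere, the level set function phi, and all helper names are assumptions for illustration. It shows how the normal and mean curvature come directly from the level set, and how a Cartesian gradient is projected into a surface (tangential) gradient:

```python
import numpy as np

# Surface: the unit sphere, represented by the level set phi(x) = |x|^2 - 1.

def grad_phi(p):
    """Cartesian gradient of phi(x, y, z) = x^2 + y^2 + z^2 - 1."""
    return 2.0 * p

def unit_normal(p):
    """Surface normal n = grad(phi) / |grad(phi)|, computed from the level set."""
    g = grad_phi(p)
    return g / np.linalg.norm(g)

def surface_gradient(grad_u, p):
    """Project the Cartesian gradient of u onto the tangent plane:
    grad_S u = (I - n n^T) grad u."""
    n = unit_normal(p)
    return grad_u - np.dot(n, grad_u) * n

def total_curvature(p, h=1e-5):
    """div(n) via central differences; for the unit sphere this is 2
    (sum of the principal curvatures)."""
    div = 0.0
    for i in range(3):
        e = np.zeros(3)
        e[i] = h
        div += (unit_normal(p + e)[i] - unit_normal(p - e)[i]) / (2 * h)
    return div

# Example: u(x, y, z) = x restricted to the sphere. Its Cartesian gradient is
# (1, 0, 0); at the north pole that vector is already tangential, so the
# projection leaves it unchanged.
p = np.array([0.0, 0.0, 1.0])
gs = surface_gradient(np.array([1.0, 0.0, 0.0]), p)
print(gs)                  # ~ [1, 0, 0]
print(total_curvature(p))  # ~ 2
```

In a physics-informed setting, these geometric quantities would enter the loss through the Cartesian differential expressions evaluated at training points sampled on the surface, with the gradients of the network output supplied by automatic differentiation rather than the analytic `grad_u` used here.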
Related papers
- Explicit Neural Surfaces: Learning Continuous Geometry With Deformation
Fields [33.38609930708073]
We introduce Explicit Neural Surfaces (ENS), an efficient smooth surface representation that encodes topology with a deformation field from a known base domain.
Compared to implicit surfaces, ENS trains faster and has several orders of magnitude faster inference times.
arXiv Detail & Related papers (2023-06-05T15:24:33Z)
- Theory on variational high-dimensional tensor networks [2.0307382542339485]
We investigate the emergent statistical properties of random high-dimensional tensor-network states and the trainability of such tensor networks.
We prove that variational high-dimensional networks suffer from barren plateaus for global loss functions.
Our results pave a way for their future theoretical studies and practical applications.
arXiv Detail & Related papers (2023-03-30T15:26:30Z)
- HSurf-Net: Normal Estimation for 3D Point Clouds by Learning Hyper
Surfaces [54.77683371400133]
We propose a novel normal estimation method called HSurf-Net, which can accurately predict normals from point clouds with noise and density variations.
Experimental results show that our HSurf-Net achieves the state-of-the-art performance on the synthetic shape dataset.
arXiv Detail & Related papers (2022-10-13T16:39:53Z)
- Minimal Neural Atlas: Parameterizing Complex Surfaces with Minimal
Charts and Distortion [71.52576837870166]
We present Minimal Neural Atlas, a novel atlas-based explicit neural surface representation.
At its core is a fully learnable parametric domain, given by an implicit probabilistic occupancy field defined on an open square of the parametric space.
Our reconstructions are more accurate in terms of the overall geometry, due to the separation of concerns on topology and geometry.
arXiv Detail & Related papers (2022-07-29T16:55:06Z)
- A Level Set Theory for Neural Implicit Evolution under Explicit Flows [102.18622466770114]
Coordinate-based neural networks parameterizing implicit surfaces have emerged as efficient representations of geometry.
We present a framework that allows applying deformation operations defined for triangle meshes onto such implicit surfaces.
We show that our approach exhibits improvements for applications like surface smoothing, mean-curvature flow, inverse rendering and user-defined editing on implicit geometry.
arXiv Detail & Related papers (2022-04-14T17:59:39Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- Differential Geometry in Neural Implicits [0.6198237241838558]
We introduce a neural implicit framework that bridges discrete differential geometry of triangle meshes and continuous differential geometry of neural implicit surfaces.
It exploits the differentiable properties of neural networks and the discrete geometry of triangle meshes to approximate them as the zero-level sets of neural implicit functions.
arXiv Detail & Related papers (2022-01-23T13:40:45Z)
- Least squares surface reconstruction on arbitrary domains [30.354512876068085]
We propose a new method for computing numerical derivatives based on 2D Savitzky-Golay filters and K-nearest neighbour kernels.
We show how to write both orthographic and perspective height-from-normals as a linear least squares problem using the same formulation.
We demonstrate improved performance relative to state-of-the-art on both synthetic and real data.
arXiv Detail & Related papers (2020-07-16T21:33:39Z)
- Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to the state of the art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.