Lie Point Symmetry and Physics Informed Networks
- URL: http://arxiv.org/abs/2311.04293v1
- Date: Tue, 7 Nov 2023 19:07:16 GMT
- Title: Lie Point Symmetry and Physics Informed Networks
- Authors: Tara Akhound-Sadegh, Laurence Perreault-Levasseur, Johannes
Brandstetter, Max Welling, Siamak Ravanbakhsh
- Abstract summary: We propose a loss function that informs the network about Lie point symmetries in the same way that PINN models try to enforce the underlying PDE through a loss function.
Our symmetry loss ensures that the infinitesimal generators of the Lie group conserve the PDE solutions.
Empirical evaluations indicate that the inductive bias introduced by the Lie point symmetries of the PDEs greatly boosts the sample efficiency of PINNs.
- Score: 59.56218517113066
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Symmetries have been leveraged to improve the generalization of neural
networks through different mechanisms from data augmentation to equivariant
architectures. However, despite their potential, their integration into neural
solvers for partial differential equations (PDEs) remains largely unexplored.
We explore the integration of PDE symmetries, known as Lie point symmetries, in
a major family of neural solvers known as physics-informed neural networks
(PINNs). We propose a loss function that informs the network about Lie point
symmetries in the same way that PINN models try to enforce the underlying PDE
through a loss function. Intuitively, our symmetry loss ensures that the
infinitesimal generators of the Lie group conserve the PDE solutions.
Effectively, this means that once the network learns a solution, it also learns
the neighbouring solutions generated by Lie point symmetries. Empirical
evaluations indicate that the inductive bias introduced by the Lie point
symmetries of the PDEs greatly boosts the sample efficiency of PINNs.
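As a rough sketch of the idea described in the abstract (not the authors' implementation), the code below trains a small PINN on the 1D heat equation u_t = u_xx and adds a symmetry loss for the Galilean boost generator v = 2t d/dx - x u d/du, written in evolutionary form so the term can be evaluated with automatic differentiation. The network size, loss weight, collocation sampling, and choice of symmetry are assumptions made here for illustration, and boundary/initial-condition losses are omitted.

```python
import torch
import torch.nn as nn

# A minimal sketch (not the authors' code) of a PINN for the 1D heat equation
# u_t = u_xx with an extra "symmetry" loss alongside the usual residual loss.
# The symmetry term comes from the Galilean boost generator of the heat
# equation, in evolutionary form; it should vanish on exact solutions.

class PINN(nn.Module):
    def __init__(self, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def grad(outputs, inputs):
    # d(outputs)/d(inputs), keeping the graph so higher derivatives work.
    return torch.autograd.grad(outputs, inputs,
                               grad_outputs=torch.ones_like(outputs),
                               create_graph=True)[0]

def losses(model, x, t):
    u = model(x, t)
    u_t, u_x = grad(u, t), grad(u, x)
    u_xx = grad(u_x, x)

    residual = u_t - u_xx                      # standard PINN residual
    pde_loss = residual.pow(2).mean()

    # Symmetry term for the Galilean boost v = 2t d/dx - x u d/du:
    # in evolutionary form, applying the prolonged generator to the residual
    # gives -(x * residual + 2t * D_x residual), which we penalize.
    d_res_dx = grad(residual, x)
    sym = x * residual + 2.0 * t * d_res_dx
    sym_loss = sym.pow(2).mean()
    return pde_loss, sym_loss

model = PINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(256, 1, requires_grad=True)     # interior collocation points
t = torch.rand(256, 1, requires_grad=True)
for step in range(1000):
    opt.zero_grad()
    pde_loss, sym_loss = losses(model, x, t)
    # The 0.1 weight is an arbitrary choice; boundary/initial losses omitted.
    (pde_loss + 0.1 * sym_loss).backward()
    opt.step()
```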
Related papers
- The Role of Fibration Symmetries in Geometric Deep Learning [0.0]
Geometric Deep Learning (GDL) unifies a broad class of machine learning techniques from the perspective of symmetries.
We propose to relax GDL to allow for local symmetries, specifically fibration symmetries in graphs, to leverage regularities of realistic instances.
Applying the inductive bias of fibration symmetries to GNNs, we derive a tighter upper bound on their expressive power.
arXiv Detail & Related papers (2024-08-28T16:04:40Z)
- The Empirical Impact of Neural Parameter Symmetries, or Lack Thereof [50.49582712378289]
We investigate the impact of neural parameter symmetries by introducing new neural network architectures.
We develop two methods, with some provable guarantees, of modifying standard neural networks to reduce parameter space symmetries.
Our experiments reveal several interesting observations on the empirical impact of parameter symmetries.
arXiv Detail & Related papers (2024-05-30T16:32:31Z)
- Symmetry group based domain decomposition to enhance physics-informed neural networks for solving partial differential equations [3.3360424430642848]
We propose a symmetry-group-based domain decomposition strategy to enhance PINNs for solving the forward and inverse problems of PDEs possessing a Lie symmetry group.
For the forward problem, we first deploy the symmetry group to generate dividing lines with known solution information, whose placement can be adjusted flexibly.
We then utilize the PINN and symmetry-enhanced PINN methods to learn the solutions in each sub-domain and finally stitch them together into the overall solution of the PDE.
arXiv Detail & Related papers (2024-04-29T09:27:17Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Adaptive Log-Euclidean Metrics for SPD Matrix Learning [73.12655932115881]
We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM).
The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
arXiv Detail & Related papers (2023-03-26T18:31:52Z)
- CP-PINNs: Data-Driven Changepoints Detection in PDEs Using Online Optimized Physics-Informed Neural Networks [0.0]
We investigate the inverse problem for Partial Differential Equations (PDEs) in scenarios where the parameters of the given PDE dynamics may exhibit changepoints at random times.
We employ Physics-Informed Neural Networks (PINNs), universal approximators capable of estimating the solution of any physical law.
We propose a PINNs extension using a Total-Variation penalty, which allows the model to accommodate multiple changepoints in the PDE dynamics.
arXiv Detail & Related papers (2022-08-18T04:01:07Z)
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a data augmentation method that partially alleviates the data requirements of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that an exhaustive list of such data transformations can be derived quantitatively from the Lie point symmetries of the equation.
We show how the augmentation can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude (a minimal augmentation sketch for the heat equation follows this list).
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
- Mean-field Analysis of Piecewise Linear Solutions for Wide ReLU Networks [83.58049517083138]
We consider a two-layer ReLU network trained via gradient descent.
We show that SGD is biased towards a simple solution.
We also provide empirical evidence that knots at locations distinct from the data points might occur.
arXiv Detail & Related papers (2021-11-03T15:14:20Z)
- DiffNet: Neural Field Solutions of Parametric Partial Differential Equations [30.80582606420882]
We consider a mesh-based approach for training a neural network to produce field predictions of solutions to PDEs.
We use a weighted Galerkin loss function based on the Finite Element Method (FEM) on a parametric elliptic PDE.
We prove theoretically, and illustrate with experiments, convergence results analogous to mesh convergence analysis deployed in finite element solutions to PDEs.
arXiv Detail & Related papers (2021-10-04T17:59:18Z)
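The Lie point symmetry data augmentation entry above turns each known solution into new solutions by applying the PDE's point symmetries. A minimal sketch of that idea for the 1D heat equation follows (not that paper's implementation); the heat-kernel seed solution and the parameter values are assumptions chosen here for illustration.

```python
import numpy as np

# A minimal sketch (not the paper's code) of Lie point symmetry data
# augmentation for the 1D heat equation u_t = u_xx: each map below sends
# samples of one known solution to samples of another exact solution.

def translate(u, x, t, a=0.3, b=0.05):
    # Space/time translation symmetry: u(x - a, t - b) is again a solution.
    return u(x - a, t - b)

def galilean_boost(u, x, t, eps=0.5):
    # Galilean boost symmetry: exp(-eps*x + eps**2*t) * u(x - 2*eps*t, t)
    # is again a solution of the heat equation.
    return np.exp(-eps * x + eps**2 * t) * u(x - 2.0 * eps * t, t)

def heat_kernel(x, t, t0=0.1):
    # A closed-form seed solution (heat kernel shifted by t0 to avoid t = 0).
    return np.exp(-x**2 / (4.0 * (t + t0))) / np.sqrt(4.0 * np.pi * (t + t0))

x = np.linspace(-2.0, 2.0, 128)
t = np.linspace(0.0, 1.0, 64)
X, T = np.meshgrid(x, t, indexing="ij")

u_original = heat_kernel(X, T)
u_translated = translate(heat_kernel, X, T)     # new training sample
u_boosted = galilean_boost(heat_kernel, X, T)   # another new training sample
```

Both transformed fields solve the same PDE, so they can be added to a neural solver's training set without running any additional numerical simulations.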