Achieving High Accuracy with PINNs via Energy Natural Gradients
- URL: http://arxiv.org/abs/2302.13163v2
- Date: Tue, 15 Aug 2023 11:48:49 GMT
- Title: Achieving High Accuracy with PINNs via Energy Natural Gradients
- Authors: Johannes Müller, Marius Zeinhofer
- Abstract summary: We show that the update direction in function space resulting from the energy natural gradient corresponds to the Newton direction modulo an orthogonal projection onto the model's tangent space.
We demonstrate experimentally that energy natural gradient descent yields highly accurate solutions with errors several orders of magnitude smaller than what is obtained when training PINNs with standard gradient descent or Adam.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose energy natural gradient descent, a natural gradient method with
respect to a Hessian-induced Riemannian metric as an optimization algorithm for
physics-informed neural networks (PINNs) and the deep Ritz method. As a main
motivation we show that the update direction in function space resulting from
the energy natural gradient corresponds to the Newton direction modulo an
orthogonal projection onto the model's tangent space. We demonstrate
experimentally that energy natural gradient descent yields highly accurate
solutions with errors several orders of magnitude smaller than what is obtained
when training PINNs with standard optimizers like gradient descent or Adam,
even when those are allowed significantly more computation time.
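The Newton-modulo-projection characterization above can be sketched numerically. The toy below uses a linear-in-parameters model with a sine basis (an illustrative assumption for simplicity; the paper works with genuine neural networks), for which the energy natural gradient step reduces to a Gauss-Newton step on the PDE residual and solves the problem in one update:

```python
import numpy as np

# Toy "PINN" for -u''(x) = f(x) on (0,1), u(0)=u(1)=0, with a model linear in
# its parameters: u(x; theta) = sum_i theta_i * sin(i*pi*x).  Hypothetical
# setup for illustration only.  Energy natural gradient descent here amounts
# to Gauss-Newton on the PDE residual:
#   theta <- theta - eta * pinv(J^T J) @ (J^T r),  J = d(residual)/d(theta).

rng = np.random.default_rng(0)
n_modes, n_pts = 5, 64
x = np.linspace(0.01, 0.99, n_pts)

# Target solution u*(x) = sin(pi x), hence f(x) = pi^2 sin(pi x).
f = np.pi**2 * np.sin(np.pi * x)

i = np.arange(1, n_modes + 1)
# -u'' applied to the basis sin(i pi x) gives (i pi)^2 sin(i pi x); the model
# is linear in theta, so the residual Jacobian J is constant.
J = (i * np.pi) ** 2 * np.sin(np.outer(x, i))   # shape (n_pts, n_modes)

theta = rng.normal(size=n_modes)

def residual(th):
    return J @ th - f            # PDE residual at the collocation points

def loss(th):
    return 0.5 * np.mean(residual(th) ** 2)

# One energy-natural-gradient step (step size 1): Gram matrix G = J^T J / n,
# Euclidean gradient g = J^T r / n.  The preconditioned direction G^+ g is
# the Newton direction projected onto the model's tangent space.
r = residual(theta)
g = J.T @ r / n_pts
G = J.T @ J / n_pts
theta_new = theta - np.linalg.pinv(G) @ g

print(loss(theta), "->", loss(theta_new))
```

Because the model is linear, a single step lands on the least-squares optimum; for nonlinear networks the same preconditioned direction is applied iteratively with a line search.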
Related papers
- Stochastic Gradient Descent for Gaussian Processes Done Right [86.83678041846971]
We show that when done right -- by which we mean using specific insights from the optimisation and kernel communities -- gradient descent is highly effective.
We introduce a stochastic dual descent algorithm, explain its design in an intuitive manner, and illustrate the design choices.
Our method places Gaussian process regression on par with state-of-the-art graph neural networks for molecular binding affinity prediction.
arXiv Detail & Related papers (2023-10-31T16:15:13Z) - Neural Gradient Learning and Optimization for Oriented Point Normal
Estimation [53.611206368815125]
We propose a deep learning approach to learn gradient vectors with consistent orientation from 3D point clouds for normal estimation.
We learn an angular distance field based on local plane geometry to refine the coarse gradient vectors.
Our method efficiently performs global gradient approximation while achieving better accuracy and generalization of the local feature description.
arXiv Detail & Related papers (2023-09-17T08:35:11Z) - Stochastic Marginal Likelihood Gradients using Neural Tangent Kernels [78.6096486885658]
We introduce lower bounds to the linearized Laplace approximation of the marginal likelihood.
These bounds are amenable to gradient-based optimization and allow trading off estimation accuracy against computational complexity.
arXiv Detail & Related papers (2023-06-06T19:02:57Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs are prone to training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose an implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Efficient Natural Gradient Descent Methods for Large-Scale Optimization
Problems [1.2891210250935146]
We propose an efficient method for computing natural gradient descent directions with respect to a generic metric in the state space.
Our technique relies on representing the natural gradient direction as a solution to a standard least-squares problem.
We can reliably compute several natural gradient directions, including the Wasserstein natural gradient, in large-scale parameter spaces.
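The least-squares reformulation mentioned above can be illustrated in the simplest setting. Assuming the state-space metric is the identity (an illustrative special case; the paper targets generic metrics), the natural gradient direction d solves the normal equations (JᵀJ)d = Jᵀv, which is exactly the standard least-squares problem min_d ‖Jd − v‖₂:

```python
import numpy as np

# Hypothetical illustration: J is the Jacobian of the parameter-to-state map,
# v the gradient in state space.  With metric M = I, the natural gradient
# direction d minimizes ||J d - v||_2.
rng = np.random.default_rng(1)
J = rng.normal(size=(50, 8))      # tall Jacobian: 50 states, 8 parameters
v = rng.normal(size=50)           # functional (state-space) gradient

# Route 1: natural gradient via a standard least-squares solve --
# no explicit Gram matrix and no matrix inverse.
d_lstsq, *_ = np.linalg.lstsq(J, v, rcond=None)

# Route 2: the same direction via the normal equations (J^T J) d = J^T v.
d_normal = np.linalg.solve(J.T @ J, J.T @ v)

print(np.allclose(d_lstsq, d_normal))
```

The least-squares route is preferable at scale because it avoids forming and inverting the Gram matrix, which is the point of the reformulation.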
arXiv Detail & Related papers (2022-02-13T07:32:17Z) - Understanding Natural Gradient in Sobolev Spaces [15.33151811602988]
We study the natural gradient induced by Sobolev metrics and develop several rigorous results.
Preliminary experimental results reveal the potential of this new natural gradient variant.
arXiv Detail & Related papers (2022-02-13T07:04:44Z) - Natural Gradient Optimization for Optical Quantum Circuits [4.645254587634926]
We implement Natural Gradient descent in the optical quantum circuit setting.
In particular, we adapt the Natural Gradient approach to a complex-valued parameter space.
We observe that the NG approach converges faster.
arXiv Detail & Related papers (2021-06-25T14:25:52Z) - q-RBFNN:A Quantum Calculus-based RBF Neural Network [31.14412266444568]
A gradient-descent-based learning approach for radial basis function neural networks (RBFNN) is proposed.
The proposed method is based on the q-gradient which is also known as Jackson derivative.
The proposed $q$-RBFNN is analyzed for its convergence performance in the context of least square algorithm.
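The q-gradient above rests on the Jackson derivative, a standard q-calculus object (the sketch below illustrates the definition only, not the paper's RBFNN learning rule):

```python
import numpy as np

# Jackson derivative: D_q f(x) = (f(q x) - f(x)) / ((q - 1) x),
# which recovers the ordinary derivative f'(x) in the limit q -> 1.
def jackson_derivative(f, x, q):
    return (f(q * x) - f(x)) / ((q - 1) * x)

# For monomials, D_q x^n = [n]_q x^(n-1) with the q-integer
# [n]_q = 1 + q + ... + q^(n-1); here n = 3 gives [3]_q = 1 + q + q^2.
f = lambda x: x**3
x, q = 2.0, 0.9
dq = jackson_derivative(f, x, q)
print(dq, (1 + q + q**2) * x**2)   # both equal [3]_q * x^2
```

As q approaches 1, [3]_q approaches 3 and the Jackson derivative recovers the classical 3x².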
arXiv Detail & Related papers (2021-06-02T08:27:12Z) - Channel-Directed Gradients for Optimization of Convolutional Neural
Networks [50.34913837546743]
We introduce optimization methods for convolutional neural networks that can be used to improve existing gradient-based optimization in terms of generalization error.
We show that defining the gradients along the output channel direction leads to a performance boost, while other directions can be detrimental.
arXiv Detail & Related papers (2020-08-25T00:44:09Z) - Towards Better Understanding of Adaptive Gradient Algorithms in
Generative Adversarial Nets [71.05306664267832]
Adaptive algorithms perform gradient updates using the history of gradients and are ubiquitous in training deep neural networks.
In this paper we analyze a variant of the Optimistic Adagrad algorithm for nonconvex-nonconcave minimax problems.
Our experiments show that adaptive gradient algorithms empirically outperform non-adaptive ones in GAN training.
arXiv Detail & Related papers (2019-12-26T22:10:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.