TT-PINN: A Tensor-Compressed Neural PDE Solver for Edge Computing
- URL: http://arxiv.org/abs/2207.01751v1
- Date: Mon, 4 Jul 2022 23:56:27 GMT
- Title: TT-PINN: A Tensor-Compressed Neural PDE Solver for Edge Computing
- Authors: Ziyue Liu, Xinling Yu, Zheng Zhang
- Abstract summary: Physics-informed neural networks (PINNs) have been increasingly employed due to their capability of modeling complex physics systems.
This paper proposes an end-to-end compressed PINN based on Tensor-Train decomposition, demonstrated on a Helmholtz equation.
- Score: 7.429526302331948
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) have been increasingly employed due
to their capability of modeling complex physics systems. To achieve better
expressiveness, increasingly larger network sizes are required in many
problems. This has caused challenges when we need to train PINNs on edge
devices with limited memory, computing and energy resources. To enable training
PINNs on edge devices, this paper proposes an end-to-end compressed PINN based
on Tensor-Train decomposition. In solving a Helmholtz equation, our proposed
model significantly outperforms the original PINNs while using far fewer
parameters, achieving satisfactory predictions with up to 15$\times$ overall
parameter reduction.
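To make the compression idea concrete, below is a minimal NumPy sketch of the classical TT-SVD algorithm applied to a dense layer weight. The layer size (64x64), the 8x8x8x8 folding, and the rank cap are illustrative assumptions, not the paper's actual architecture; the sketch only shows how truncated TT ranks shrink the parameter count.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into Tensor-Train cores via sequential
    truncated SVD (the classical TT-SVD algorithm)."""
    shape = tensor.shape
    cores = []
    rank_prev = 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        rank = min(max_rank, S.size)
        # Core k has shape (r_{k-1}, n_k, r_k).
        cores.append(U[:, :rank].reshape(rank_prev, shape[k], rank))
        # Carry the remainder S @ Vt forward, refolded for the next mode.
        mat = (S[:rank, None] * Vt[:rank]).reshape(rank * shape[k + 1], -1)
        rank_prev = rank
    cores.append(mat.reshape(rank_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([out.ndim - 1], [0]))
    return out.reshape(out.shape[1:-1])  # drop the boundary ranks of size 1

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))   # dense hidden-layer weight: 4096 parameters
T = W.reshape(8, 8, 8, 8)           # fold the matrix into a 4-way tensor

cores = tt_svd(T, max_rank=4)       # truncated TT ranks compress the weight
tt_params = sum(core.size for core in cores)  # 320 parameters, ~12.8x smaller
```

With all ranks kept (no truncation), the reconstruction is exact; truncating the ranks trades approximation error for the kind of parameter reduction the abstract reports.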
Related papers
- iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).
We propose incremental PINNs that can learn multiple tasks sequentially without additional parameters for new tasks and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one, creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
arXiv Detail & Related papers (2023-04-10T20:19:20Z) - GPT-PINN: Generative Pre-Trained Physics-Informed Neural Networks toward
non-intrusive Meta-learning of parametric PDEs [0.0]
We propose the Generative Pre-Trained PINN (GPT-PINN) to mitigate both challenges in the setting of parametric PDEs.
As a network of networks, its outer-/meta-network is hyper-reduced, with only one hidden layer containing a significantly reduced number of neurons.
The meta-network adaptively "learns" the parametric dependence of the system and "grows" this hidden layer one neuron at a time.
arXiv Detail & Related papers (2023-03-27T02:22:09Z) - AutoPINN: When AutoML Meets Physics-Informed Neural Networks [30.798918516407376]
PINNs enable the estimation of critical parameters, which are unobservable via physical tools, through observable variables.
Existing PINNs are often manually designed, which is time-consuming and may lead to suboptimal performance.
We propose a framework that enables the automated design of PINNs by combining AutoML and PINNs.
arXiv Detail & Related papers (2022-12-08T03:44:08Z) - Separable PINN: Mitigating the Curse of Dimensionality in
Physics-Informed Neural Networks [6.439575695132489]
Physics-informed neural networks (PINNs) have emerged as new data-driven PDE solvers for both forward and inverse problems.
We demonstrate that the computations in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINNs.
We propose a network architecture, called separable PINN (SPINN), which can facilitate forward-mode AD for more efficient computation.
arXiv Detail & Related papers (2022-11-16T08:46:52Z) - FO-PINNs: A First-Order formulation for Physics Informed Neural Networks [1.8874301050354767]
Physics-Informed Neural Networks (PINNs) are a class of deep learning neural networks that learn the response of a physical system without any simulation data.
PINNs are successfully used for solving forward and inverse problems, but their accuracy decreases significantly for parameterized systems.
We present first-order physics-informed neural networks (FO-PINNs) that are trained using a first-order formulation of the PDE loss function.
arXiv Detail & Related papers (2022-10-25T20:25:33Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural
Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Revisiting PINNs: Generative Adversarial Physics-informed Neural
Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired from the weighting strategy of the Adaboost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, using an estimator derived from Stein's identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
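The Gaussian-smoothing idea behind that estimator can be sketched in one dimension: for a smoothed function E[f(x + sigma*eps)] with standard normal eps, Stein's identity expresses its second derivative as an expectation of function values only, so no back-propagation is needed. The antithetic sampling below is a common variance-reduction add-on assumed for the demo, not necessarily the authors' exact estimator.

```python
import numpy as np

def smoothed_second_derivative(f, x, sigma=0.1, n_samples=200_000, seed=0):
    """Estimate the second derivative of the Gaussian-smoothed f at x
    via Stein's identity:
        d2/dx2 E[f(x + sigma*eps)] = E[(eps**2 - 1) * f(x + sigma*eps)] / sigma**2
    Uses only function evaluations (no back-propagation); the symmetric
    +/- eps combination is an antithetic trick to cut Monte Carlo variance.
    """
    eps = np.random.default_rng(seed).standard_normal(n_samples)
    sym = f(x + sigma * eps) + f(x - sigma * eps) - 2.0 * f(x)
    return np.mean((eps**2 - 1.0) * sym) / (2.0 * sigma**2)

# Sanity check on a known case: for f = sin, f''(x) = -sin(x),
# and the estimate approaches it as sigma shrinks.
estimate = smoothed_second_derivative(np.sin, 1.0)
```

The smoothing introduces an O(sigma^2) bias, so sigma trades bias against Monte Carlo variance; the second-order accuracy here is what lets such estimators handle PDE residual terms without nested back-propagation.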
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Belief Propagation Neural Networks [103.97004780313105]
We introduce belief propagation neural networks (BPNNs).
BPNNs operate on factor graphs and generalize belief propagation (BP).
We show that BPNNs converge 1.7x faster on Ising models while providing tighter bounds.
On challenging model-counting problems, BPNNs compute estimates hundreds of times faster than state-of-the-art handcrafted methods.
arXiv Detail & Related papers (2020-07-01T07:39:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.