Evolutionary Optimization of Physics-Informed Neural Networks: Advancing Generalizability by the Baldwin Effect
- URL: http://arxiv.org/abs/2312.03243v3
- Date: Wed, 23 Apr 2025 16:21:00 GMT
- Title: Evolutionary Optimization of Physics-Informed Neural Networks: Advancing Generalizability by the Baldwin Effect
- Authors: Jian Cheng Wong, Chin Chun Ooi, Abhishek Gupta, Pao-Hsiung Chiu, Joshua Shao Zheng Low, My Ha Dao, Yew-Soon Ong
- Abstract summary: Physics-informed neural networks (PINNs) are at the forefront of scientific machine learning. This paper proposes a pioneering approach to advance the generalizability of PINNs through the framework of Baldwinian evolution.
- Score: 22.57730294475146
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed neural networks (PINNs) are at the forefront of scientific machine learning, making possible the creation of machine intelligence that is cognizant of physical laws and able to accurately simulate them. However, today's PINNs are often trained for a single physics task and require computationally expensive re-training for each new task, even for tasks from similar physics domains. To address this limitation, this paper proposes a pioneering approach to advance the generalizability of PINNs through the framework of Baldwinian evolution. Drawing inspiration from the neurodevelopment of precocial species that have evolved to learn, predict and react quickly to their environment, we envision PINNs that are pre-wired with connection strengths inducing strong biases towards efficient learning of physics. A novel two-stage stochastic programming formulation coupling evolutionary selection pressure (based on proficiency over a distribution of physics tasks) with lifetime learning (to specialize on a sampled subset of those tasks) is proposed to instantiate the Baldwin effect. The evolved Baldwinian-PINNs demonstrate fast and physics-compliant prediction capabilities across a range of empirically challenging problem instances with more than an order of magnitude improvement in prediction accuracy at a fraction of the computation cost compared to state-of-the-art gradient-based meta-learning methods. For example, when solving the diffusion-reaction equation, a 70x improvement in accuracy was obtained while taking 700x less computational time. This paper thus marks a leap forward in the meta-learning of PINNs as generalizable physics solvers. Sample codes are available at https://github.com/chiuph/Baldwinian-PINN.
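The two-stage formulation described in the abstract can be made concrete with a short sketch. The following is a minimal, hypothetical illustration, not the authors' implementation (the linked repository contains that): the inherited genotype is taken to be the hidden-layer parameters of a small random-feature network, lifetime learning is a fast linear least-squares fit of the output weights on each sampled task, and selection pressure is the average post-learning loss over a family of 1D Poisson problems, a stand-in task distribution chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 40                                        # hidden units per individual
X = np.linspace(0.0, 1.0, 64)[:, None]        # collocation points on [0, 1]

def features(genes, x):
    """tanh features and their exact second x-derivatives."""
    a, b = genes                              # each of shape (1, H)
    t = np.tanh(x @ a + b)
    # d^2/dx^2 tanh(a*x + b) = -2 * a^2 * tanh(z) * (1 - tanh(z)^2)
    return t, -2.0 * a**2 * t * (1.0 - t**2)

def lifetime_learn(genes, k):
    """Inner stage: closed-form least-squares fit of the output weights on
    one sampled task, u''(x) = -(k*pi)^2 * sin(k*pi*x), u(0) = u(1) = 0."""
    _, phi_xx = features(genes, X)
    phi0, _ = features(genes, np.array([[0.0]]))
    phi1, _ = features(genes, np.array([[1.0]]))
    A = np.vstack([phi_xx, 10.0 * phi0, 10.0 * phi1])  # residual + weighted BCs
    y = np.vstack([-(k * np.pi) ** 2 * np.sin(k * np.pi * X),
                   [[0.0]], [[0.0]]])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.mean((A @ w - y) ** 2))            # post-learning loss

def fitness(genes, tasks):
    """Outer stage: selection pressure is proficiency *after* learning,
    averaged over sampled tasks; the learned weights are never inherited
    (the Baldwin effect), only the pre-wiring (a, b) is."""
    return np.mean([lifetime_learn(genes, k) for k in tasks])

# Simple elitist evolution of the inherited pre-wiring.
pop = [(rng.normal(0, 3, (1, H)), rng.normal(0, 3, (1, H))) for _ in range(24)]
for gen in range(30):
    tasks = rng.integers(1, 6, size=4)        # sample a subset of tasks
    pop.sort(key=lambda g: fitness(g, tasks))
    elite = pop[:6]
    pop = elite + [(a + rng.normal(0, 0.1, a.shape),
                    b + rng.normal(0, 0.1, b.shape))
                   for a, b in (elite[i % 6] for i in range(18))]
print("post-learning loss on held-out tasks:", fitness(pop[0], [2, 3, 7]))
```

In this toy setting the evolved (a, b) play the role of the "pre-wired connection strengths" from the abstract: a good genotype makes every subsequent least-squares fit cheap and accurate, including on a task (here k = 7) outside the training distribution.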
Related papers
- Evolutionary Optimization of Physics-Informed Neural Networks: Survey and Prospects [23.92936460045325]
Physics-informed neural networks (PINNs) infuse mathematically expressible laws of nature into their training loss function.
PINNs provide advantages over purely data-driven models in limited-data regimes.
This review is the first to examine PINNs in terms of model optimization and generalization.
arXiv Detail & Related papers (2025-01-11T15:45:11Z)
- Advancing Physics Data Analysis through Machine Learning and Physics-Informed Neural Networks [0.0]
This project evaluates various machine learning (ML) algorithms for physics data analysis.
We apply these techniques to a binary classification task that determines the experimental viability of simulated scenarios.
XGBoost emerged as the preferred choice among the evaluated machine learning algorithms for its speed and effectiveness.
arXiv Detail & Related papers (2024-10-18T11:05:52Z)
- Metamizer: a versatile neural optimizer for fast and accurate physics simulations [4.717325308876749]
We introduce Metamizer, a novel neural network that iteratively solves a wide range of physical systems with high accuracy.
We demonstrate that Metamizer achieves unprecedented accuracy among deep-learning-based approaches.
Our results suggest that Metamizer could have a profound impact on future numerical solvers.
arXiv Detail & Related papers (2024-10-10T11:54:31Z)
- Improved physics-informed neural network in mitigating gradient related failures [11.356695216531328]
Physics-informed neural networks (PINNs) integrate fundamental physical principles with advanced data-driven techniques.
PINNs face persistent challenges with stiffness in gradient flow, which limits their predictive capabilities.
This paper presents an improved PINN to mitigate gradient-related failures.
arXiv Detail & Related papers (2024-07-28T07:58:10Z)
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- Architectural Strategies for the optimization of Physics-Informed Neural Networks [30.92757082348805]
Physics-informed neural networks (PINNs) offer a promising avenue for tackling both forward and inverse problems in partial differential equations (PDEs).
Despite their remarkable empirical success, PINNs have garnered a reputation for their notorious training challenges across a spectrum of PDEs.
arXiv Detail & Related papers (2024-02-05T04:15:31Z)
- ST-PINN: A Self-Training Physics-Informed Neural Network for Partial Differential Equations [13.196871939441273]
Partial differential equations (PDEs) are an essential computational kernel in physics and engineering.
With the advance of deep learning, physics-informed neural networks (PINNs) have shown great potential for fast PDE solving in various applications.
We propose a self-training physics-informed neural network, ST-PINN, to address the low accuracy and convergence problems of existing PINNs.
arXiv Detail & Related papers (2023-06-15T15:49:13Z)
- Toward stochastic neural computing [11.955322183964201]
We propose a theory of neural computing in which streams of noisy inputs are transformed and processed through populations of spiking neurons.
We demonstrate the application of our method to Intel's Loihi neuromorphic hardware.
arXiv Detail & Related papers (2023-05-23T12:05:35Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends a recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Neuroevolution of Physics-Informed Neural Nets: Benchmark Problems and Comparative Results [25.12291688711645]
Physics-informed neural networks (PINNs) are one of the key techniques at the forefront of recent advances in scientific machine learning.
PINNs' unique loss formulations lead to a high degree of complexity and ruggedness that may not be conducive to gradient descent.
Neuroevolution algorithms, with their superior global search capacity, may be a better choice for PINNs.
arXiv Detail & Related papers (2022-12-15T05:54:16Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs (an illustrative sketch of this reweighting idea appears after this list).
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures of artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
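To make the point-weighting idea from the GA-PINN entry above concrete, here is a minimal sketch in which collocation points with large PDE residuals are periodically upweighted, in the spirit of AdaBoost. The model (a random-feature fit to u'(x) = cos(x) with u(0) = 0) and the exponential reweighting rule are illustrative assumptions, not the method of that paper.

```python
import numpy as np

rng = np.random.default_rng(1)
H, N = 30, 100
x = np.linspace(0.0, 2 * np.pi, N)[:, None]
a = rng.normal(0, 1, (1, H))
b = rng.normal(0, 1, (1, H))
t = np.tanh(x @ a + b)
phi_x = a * (1.0 - t ** 2)            # d/dx of the tanh features
w = np.zeros((H, 1))                  # trainable output weights
p = np.full(N, 1.0 / N)               # per-point weights, initially uniform

for epoch in range(2000):
    r = (phi_x @ w - np.cos(x)).ravel()      # PDE residual u'(x) - cos(x)
    u0 = (np.tanh(b) @ w).item()             # boundary residual at x = 0
    # gradient of 0.5 * sum_i p_i * r_i^2 + 0.5 * u0^2 with respect to w
    grad = phi_x.T @ (p * r)[:, None] + u0 * np.tanh(b).T
    w -= 0.02 * grad
    if epoch % 100 == 0:
        # periodically upweight the collocation points with large residuals
        p = np.exp(np.abs(r) / (np.abs(r).mean() + 1e-12))
        p /= p.sum()

print("max |PDE residual|:", np.abs(phi_x @ w - np.cos(x)).max())
```

The reweighting step concentrates the training loss on the hardest collocation points, the same intuition AdaBoost applies to misclassified samples; the update interval and the exponential form are free design choices in this sketch.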