Training Physics-Informed Neural Networks via Multi-Task Optimization
for Traffic Density Prediction
- URL: http://arxiv.org/abs/2307.03920v1
- Date: Sat, 8 Jul 2023 07:11:52 GMT
- Title: Training Physics-Informed Neural Networks via Multi-Task Optimization
for Traffic Density Prediction
- Authors: Bo Wang and A. K. Qin and Sajjad Shafiei and Hussein Dia and
Adriana-Simona Mihaita and Hanna Grzybowska
- Abstract summary: Physics-informed neural networks (PINNs) are a newly emerging research frontier in machine learning.
We propose a new PINN training framework based on the multi-task optimization (MTO) paradigm.
We implement the proposed framework and apply it to train a PINN for the traffic density prediction problem.
- Score: 3.3823703740215865
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) are an emerging research
frontier in machine learning that incorporates physical laws governing a given
data set, e.g., those described by partial differential equations (PDEs), into
the training of a neural network (NN) on that data set. In PINNs, the NN acts
as the solution approximator for the PDE, while the PDE acts as prior
knowledge that guides the NN training, helping the NN generalize well when
training data are limited. However, training PINNs is non-trivial, largely due
to the complexity of a loss function composed of both data-fitting and
physical-law terms. In this work, we propose a new PINN training framework
based on the multi-task optimization (MTO) paradigm. Under this framework,
multiple auxiliary tasks are created and solved together with the given (main)
task, and useful knowledge gained from solving one task is adaptively
transferred to assist the others, with the aim of improving performance on the
main task. We implement the proposed framework and apply it to train a PINN
for the traffic density prediction problem. Experimental results demonstrate
that our proposed training framework yields significant performance
improvements over conventional PINN training.
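The abstract does not specify the PDE or the exact transfer mechanism, so the following PyTorch sketch is only illustrative: it trains a PINN on the LWR traffic-flow equation with a Greenshields speed law (a common model for traffic density), alongside auxiliary tasks that differ in how heavily the physics loss is weighted, with a crude periodic parameter blend standing in for the paper's adaptive knowledge transfer. All constants and names are assumptions.

```python
# Illustrative sketch only: a PINN for the LWR traffic-flow PDE,
#   rho_t + (rho * v(rho))_x = 0,  v(rho) = V_FREE * (1 - rho / RHO_MAX),
# trained alongside auxiliary tasks that differ in PDE-loss weighting.
# The periodic parameter blend below is a stand-in for the paper's
# adaptive knowledge transfer; constants and shapes are assumptions.
import torch
import torch.nn as nn

V_FREE, RHO_MAX = 30.0, 0.2  # assumed free-flow speed and jam density

def make_net():
    return nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                         nn.Linear(64, 64), nn.Tanh(),
                         nn.Linear(64, 1), nn.Sigmoid())  # density scale in [0, 1]

def pde_residual(net, xt):  # xt columns: (x, t)
    xt = xt.clone().requires_grad_(True)
    rho = RHO_MAX * net(xt)
    flux = rho * V_FREE * (1.0 - rho / RHO_MAX)  # Greenshields flux
    drho = torch.autograd.grad(rho.sum(), xt, create_graph=True)[0]
    dflux = torch.autograd.grad(flux.sum(), xt, create_graph=True)[0]
    return drho[:, 1:2] + dflux[:, 0:1]          # rho_t + flux_x

def task_loss(net, xt_d, rho_d, xt_c, w_pde):
    data = ((RHO_MAX * net(xt_d) - rho_d) ** 2).mean()
    phys = (pde_residual(net, xt_c) ** 2).mean()
    return data + w_pde * phys

weights = [1.0, 0.1, 10.0]                 # index 0 is the main task
nets = [make_net() for _ in weights]
opts = [torch.optim.Adam(n.parameters(), lr=1e-3) for n in nets]
xt_d, rho_d = torch.rand(256, 2), 0.1 * torch.rand(256, 1)  # toy data
xt_c = torch.rand(1024, 2)                 # collocation points

for step in range(2000):
    losses = []
    for net, opt, w in zip(nets, opts, weights):
        opt.zero_grad()
        loss = task_loss(net, xt_d, rho_d, xt_c, w)
        loss.backward(); opt.step()
        losses.append(loss.item())
    if step % 200 == 0:                    # crude "knowledge transfer":
        best = min(range(len(nets)), key=losses.__getitem__)
        if best != 0:                      # blend best auxiliary into main
            with torch.no_grad():
                for pm, pb in zip(nets[0].parameters(),
                                  nets[best].parameters()):
                    pm.mul_(0.9).add_(0.1 * pb)
```

Comparing losses across differently weighted tasks is only a heuristic here; the paper's framework decides adaptively when and what to transfer.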
Related papers
- DeepONet as a Multi-Operator Extrapolation Model: Distributed Pretraining with Physics-Informed Fine-Tuning [6.635683993472882]
We propose a novel fine-tuning method to achieve multi-operator learning.
Our approach uses distributed learning to integrate data from multiple operators during pre-training, while physics-informed methods enable zero-shot fine-tuning.
arXiv Detail & Related papers (2024-11-11T18:58:46Z)
- DNN Partitioning, Task Offloading, and Resource Allocation in Dynamic Vehicular Networks: A Lyapunov-Guided Diffusion-Based Reinforcement Learning Approach [49.56404236394601]
We formulate the problem of joint DNN partitioning, task offloading, and resource allocation in Vehicular Edge Computing.
Our objective is to minimize the DNN-based task completion time while guaranteeing system stability over time.
We propose a Multi-Agent Diffusion-based Deep Reinforcement Learning (MAD2RL) algorithm, incorporating the innovative use of diffusion models.
arXiv Detail & Related papers (2024-06-11T06:31:03Z)
- A Multi-Head Ensemble Multi-Task Learning Approach for Dynamical Computation Offloading [62.34538208323411]
We propose a multi-head ensemble multi-task learning (MEMTL) approach with a shared backbone and multiple prediction heads (PHs); a sketch of this architecture follows below.
MEMTL outperforms benchmark methods in both the inference accuracy and mean square error without requiring additional training data.
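The summary above names the architecture but not its details, so the following PyTorch sketch is a guess at its general shape: a shared backbone feeding several prediction heads whose outputs are averaged into an ensemble. Layer sizes, depths, and the averaging rule are all assumptions, not the paper's design.

```python
# Hypothetical shared-backbone, multi-head model in the spirit of the
# MEMTL summary above; every dimension and the mean-ensemble rule are
# assumptions.
import torch
import torch.nn as nn

class MultiHeadEnsemble(nn.Module):
    def __init__(self, in_dim=16, hidden=64, out_dim=4, n_heads=3):
        super().__init__()
        self.backbone = nn.Sequential(            # shared representation
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU())
        self.heads = nn.ModuleList(               # one prediction head each
            [nn.Linear(hidden, out_dim) for _ in range(n_heads)])

    def forward(self, x):
        z = self.backbone(x)
        preds = torch.stack([h(z) for h in self.heads])  # (heads, batch, out)
        return preds.mean(dim=0)                  # average heads as ensemble

model = MultiHeadEnsemble()
out = model(torch.randn(8, 16))                   # -> shape (8, 4)
```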
arXiv Detail & Related papers (2023-09-02T11:01:16Z)
- Ensemble learning for Physics Informed Neural Networks: a Gradient Boosting approach [10.250994619846416]
We present a new training paradigm referred to as "gradient boosting" (GB).
Instead of learning the solution of a given PDE directly with a single neural network, our algorithm employs a sequence of neural networks, each refining the result of the previous ones, to achieve a superior outcome (see the sketch below).
This work also opens the door to employing ensemble learning techniques in PINNs.
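Since the summary gives only the idea, the sketch below illustrates the boosting scheme on a stand-in problem (1-D Poisson, u'' = f with zero boundaries) rather than the paper's own benchmarks: each stage adds a small network trained against the residual left by the frozen earlier stages.

```python
# Gradient-boosting-style PINN training on an illustrative 1-D Poisson
# problem u''(x) = f(x), u(0) = u(1) = 0, with exact solution sin(pi x).
# The solution is a sum of stages; each new stage trains while earlier
# stages stay frozen. Stage counts and sizes are assumptions.
import torch
import torch.nn as nn

def f(x):
    return -(torch.pi ** 2) * torch.sin(torch.pi * x)

def residual(u_fn, x):
    x = x.clone().requires_grad_(True)
    u = u_fn(x)
    ux = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    uxx = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]
    return uxx - f(x)

def make_net():
    return nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

stages = []
x_coll = torch.rand(256, 1)                     # collocation points
x_bnd = torch.tensor([[0.0], [1.0]])            # boundary points, u = 0

for _ in range(3):                              # boosting stages
    net, frozen = make_net(), list(stages)
    u_fn = lambda x: net(x) + sum(s(x) for s in frozen)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(1000):
        opt.zero_grad()
        loss = (residual(u_fn, x_coll) ** 2).mean() + (u_fn(x_bnd) ** 2).mean()
        loss.backward(); opt.step()
    for p in net.parameters():
        p.requires_grad_(False)                 # freeze this stage
    stages.append(net)
```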
arXiv Detail & Related papers (2023-02-25T19:11:44Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- A Comprehensive Survey on Distributed Training of Graph Neural Networks [59.785830738482474]
Graph neural networks (GNNs) have been demonstrated to be a powerful algorithmic model in broad application fields.
To scale GNN training up for large-scale and ever-growing graphs, the most promising solution is distributed training.
Related research on distributed GNN training is exceptionally vast and is being published at a rapid pace.
arXiv Detail & Related papers (2022-11-10T06:22:12Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the Adaboost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs; a sketch of such a weighting follows below.
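The exact weighting rule is not in the summary, so the snippet below shows one plausible Adaboost-flavored variant: collocation points with larger PDE residuals receive larger weights in the physics loss. The softmax form and the temperature parameter are assumptions.

```python
# Hypothetical point-weighting (PW) loss for PINN training: harder
# collocation points (larger residuals) get larger weights.
import torch

def weighted_residual_loss(residuals: torch.Tensor, temperature: float = 1.0):
    # residuals: (N, 1) PDE residuals at the collocation points
    with torch.no_grad():                       # weights carry no gradient
        w = torch.softmax(residuals.abs().squeeze(-1) / temperature, dim=0)
    return (w * residuals.squeeze(-1) ** 2).sum()
```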
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Scientific Machine Learning through Physics-Informed Neural Networks: Where we are and What's next [5.956366179544257]
Physics-Informed Neural Networks (PINNs) are neural networks (NNs) that encode model equations.
PINNs are nowadays used to solve PDEs, fractional equations, and integro-differential equations.
arXiv Detail & Related papers (2022-01-14T19:05:44Z)
- Training multi-objective/multi-task collocation physics-informed neural network with student/teachers transfer learnings [0.0]
This paper presents a PINN training framework that employs pre-training steps and a net-to-net knowledge transfer algorithm.
A multi-objective optimization algorithm may improve the performance of a physics-informed neural network with competing constraints.
arXiv Detail & Related papers (2021-07-24T00:43:17Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.