Differentiable Multi-Fidelity Fusion: Efficient Learning of Physics
Simulations with Neural Architecture Search and Transfer Learning
- URL: http://arxiv.org/abs/2306.06904v1
- Date: Mon, 12 Jun 2023 07:18:13 GMT
- Title: Differentiable Multi-Fidelity Fusion: Efficient Learning of Physics
Simulations with Neural Architecture Search and Transfer Learning
- Authors: Yuwen Deng, Wang Kang, Wei W. Xing
- Abstract summary: We propose the differentiable multi-fidelity (DMF) model, which leverages neural architecture search (NAS) to automatically find a suitable model architecture for each problem.
DMF can efficiently learn physics simulations from only a few high-fidelity training samples and outperforms state-of-the-art methods by a significant margin.
- Score: 1.0024450637989093
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With rapid progress in deep learning, neural networks have been widely used
in scientific research and engineering applications as surrogate models.
Despite the great success of neural networks in fitting complex systems, two
major challenges still remain: i) the lack of generalization on different
problems/datasets, and ii) the demand for large amounts of simulation data that
are computationally expensive. To resolve these challenges, we propose the
differentiable multi-fidelity (DMF) model, which leverages neural architecture
search (NAS) to automatically find a suitable model architecture for each
problem, and transfer learning to carry the knowledge learned from
low-fidelity (fast but inaccurate) data over to the high-fidelity (slow but
accurate) model. Recent machine learning techniques such as hyperparameter
search and alternating learning are used to improve the efficiency and
robustness of DMF. As a result, DMF can efficiently learn physics simulations
from only a few high-fidelity training samples and outperforms state-of-the-art
methods by a significant margin (up to 58$\%$ improvement in RMSE) on a
variety of synthetic and practical benchmark problems.
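The two ingredients of the abstract can be illustrated with a toy sketch (not the authors' code): a crude stand-in for architecture search that selects a model capacity on cheap low-fidelity data, followed by a transfer-learning step that warm-starts from the low-fidelity weights and fine-tunes on a handful of high-fidelity samples. The polynomial surrogate, the simulators `f_lo`/`f_hi`, and the ridge-penalized warm start are all illustrative assumptions, not the DMF model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: a cheap, biased low-fidelity simulator and an
# expensive, accurate high-fidelity one.
def f_lo(x):
    return np.sin(x)

def f_hi(x):
    return np.sin(x) + 0.15 * x

def feats(x, d):
    # Polynomial features [1, x, ..., x^d] act as a toy "architecture".
    return np.vander(x, d + 1, increasing=True)

def fit(x, y, d, w0=None, lam=1e-8):
    # Ridge solve; when w0 is given, deviation from the transferred
    # weights (rather than from zero) is penalized.
    P = feats(x, d)
    A = P.T @ P + lam * np.eye(d + 1)
    b = P.T @ y + (lam * w0 if w0 is not None else 0.0)
    return np.linalg.solve(A, b)

# Abundant low-fidelity data, scarce high-fidelity data.
x_lo = rng.uniform(-3.0, 3.0, 200)
x_hi = rng.uniform(-3.0, 3.0, 8)
x_val = rng.uniform(-3.0, 3.0, 50)

# Step 1 (stand-in for NAS): pick the capacity with the best validation
# error, using only cheap low-fidelity evaluations.
def val_mse(d):
    w = fit(x_lo, f_lo(x_lo), d)
    return np.mean((feats(x_val, d) @ w - f_lo(x_val)) ** 2)

best_d = min(range(1, 10), key=val_mse)

# Step 2 (transfer learning): warm-start from the low-fidelity weights
# and fine-tune on the few high-fidelity samples.
w_lo = fit(x_lo, f_lo(x_lo), best_d)
w_hi = fit(x_hi, f_hi(x_hi), best_d, w0=w_lo, lam=1e-2)

x_test = np.linspace(-3.0, 3.0, 100)
rmse = float(np.sqrt(np.mean((feats(x_test, best_d) @ w_hi - f_hi(x_test)) ** 2)))
print(f"selected degree: {best_d}, high-fidelity test RMSE: {rmse:.3f}")
```

The design choice mirrors the paper's premise at sketch level: the expensive search over model capacity is paid for with cheap low-fidelity data, and the high-fidelity fit only needs to correct the transferred surrogate rather than learn from scratch.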
Related papers
- FMint: Bridging Human Designed and Data Pretrained Models for Differential Equation Foundation Model [5.748690310135373]
We propose a novel multi-modal foundation model, named FMint, to bridge the gap between human-designed and data-driven models.
Built on a decoder-only transformer architecture with in-context learning, FMint utilizes both numerical and textual data to learn a universal error correction scheme.
Our results demonstrate the effectiveness of the proposed model in terms of both accuracy and efficiency compared to classical numerical solvers.
arXiv Detail & Related papers (2024-04-23T02:36:47Z)
- Learning Controllable Adaptive Simulation for Multi-resolution Physics [86.8993558124143]
We introduce Learning Controllable Adaptive Simulation for Multi-resolution Physics (LAMP), the first fully deep-learning-based surrogate model of its kind.
LAMP consists of a Graph Neural Network (GNN) for learning the forward evolution, and a GNN-based actor-critic for learning the policy of spatial refinement and coarsening.
We demonstrate that LAMP outperforms state-of-the-art deep learning surrogate models and can adaptively trade off computation to improve long-term prediction error.
arXiv Detail & Related papers (2023-05-01T23:20:27Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Efficient Model-Based Multi-Agent Mean-Field Reinforcement Learning [89.31889875864599]
We propose an efficient model-based reinforcement learning algorithm for learning in multi-agent systems.
Our main theoretical contributions are the first general regret bounds for model-based reinforcement learning for mean-field control (MFC).
We provide a practical parametrization of the core optimization problem.
arXiv Detail & Related papers (2021-07-08T18:01:02Z)
- Model-Based Deep Learning [155.063817656602]
Signal processing, communications, and control have traditionally relied on classical statistical modeling techniques.
Deep neural networks (DNNs) use generic architectures which learn to operate from data, and demonstrate excellent performance.
We are interested in hybrid techniques that combine principled mathematical models with data-driven systems to benefit from the advantages of both approaches.
arXiv Detail & Related papers (2020-12-15T16:29:49Z)
- Learning Mesh-Based Simulation with Graph Networks [20.29893312074383]
We introduce MeshGraphNets, a framework for learning mesh-based simulations using graph neural networks.
Our results show it can accurately predict the dynamics of a wide range of physical systems, including aerodynamics, structural mechanics, and cloth.
arXiv Detail & Related papers (2020-10-07T13:34:49Z)
- Modeling System Dynamics with Physics-Informed Neural Networks Based on Lagrangian Mechanics [3.214927790437842]
Two main modeling approaches often fail to meet requirements: first principles methods suffer from high bias, whereas data-driven modeling tends to have high variance.
We present physics-informed neural ordinary differential equations (PINODE), a hybrid model that combines the two modeling techniques to overcome the aforementioned problems.
Our findings are of interest for model-based control and system identification of mechanical systems.
arXiv Detail & Related papers (2020-05-29T15:10:43Z)
- Transfer learning based multi-fidelity physics informed deep neural network [0.0]
The governing differential equation is either not known or known in an approximate sense.
This paper presents a novel multi-fidelity physics-informed deep neural network (MF-PIDNN).
MF-PIDNN blends physics-informed and data-driven deep learning techniques by using the concept of transfer learning.
arXiv Detail & Related papers (2020-05-19T13:57:48Z)
- Real-time Federated Evolutionary Neural Architecture Search [14.099753950531456]
Federated learning is a distributed machine learning approach to privacy preservation.
We propose an evolutionary approach to real-time federated neural architecture search that not only optimizes model performance but also reduces the local payload.
This way, we effectively reduce computational and communication costs required for evolutionary optimization and avoid big performance fluctuations of the local models.
arXiv Detail & Related papers (2020-03-04T17:03:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.