Building Flexible Machine Learning Models for Scientific Computing at
Scale
- URL: http://arxiv.org/abs/2402.16014v1
- Date: Sun, 25 Feb 2024 07:19:01 GMT
- Title: Building Flexible Machine Learning Models for Scientific Computing at
Scale
- Authors: Tianyu Chen, Haoyi Zhou, Ying Li, Hao Wang, Chonghan Gao, Shanghang
Zhang, Jianxin Li
- Abstract summary: We introduce OmniArch, a paradigm-shifting approach for building foundation models in scientific computing.
Pre-trained on the comprehensive PDEBench dataset, OmniArch sets new performance benchmarks for 1D, 2D and 3D PDEs.
The model's representations further extend to inverse problem-solving, highlighting the transformative potential of AI-enabled Scientific Computing (AI4SC) foundation models.
- Score: 36.50343666079548
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Foundation models have revolutionized knowledge acquisition across domains,
and our study introduces OmniArch, a paradigm-shifting approach designed for
building foundation models in multi-physics scientific computing. OmniArch's
pre-training involves a versatile pipeline that processes multi-physics
spatio-temporal data, casting forward problem learning into scalable
auto-regressive tasks, while our novel Physics-Informed Reinforcement Learning
(PIRL) technique during fine-tuning ensures alignment with physical laws.
Pre-trained on the comprehensive PDEBench dataset, OmniArch not only sets new
performance benchmarks for 1D, 2D and 3D PDEs but also demonstrates exceptional
adaptability to new physics via few-shot and zero-shot learning approaches. The
model's representations further extend to inverse problem-solving, highlighting
the transformative potential of AI-enabled Scientific Computing (AI4SC)
foundation models for engineering applications and physics discovery.
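The abstract casts forward-problem learning as scalable auto-regressive prediction over spatio-temporal states. OmniArch's actual pipeline and architecture are not given here; purely as a minimal illustration of that auto-regressive framing, the toy below fits a hypothetical linear one-step operator to a synthetic 1D diffusion trajectory and rolls it forward:

```python
import numpy as np

# Synthetic 1D diffusion trajectory: u_{t+1} = u_t + nu * Laplacian(u_t)
rng = np.random.default_rng(0)
nx, nt, nu = 64, 40, 0.2
L = (np.diag(-2.0 * np.ones(nx)) + np.diag(np.ones(nx - 1), 1)
     + np.diag(np.ones(nx - 1), -1))           # discrete Laplacian
step = np.eye(nx) + nu * L                     # true one-step operator
traj = [rng.standard_normal(nx)]
for _ in range(nt):
    traj.append(step @ traj[-1])
traj = np.array(traj)                          # shape (nt + 1, nx)

# Auto-regressive learning: fit a next-step map A from (u_t, u_{t+1}) pairs
X, Y = traj[:-1], traj[1:]
A, *_ = np.linalg.lstsq(X, Y, rcond=None)      # u_{t+1} ~= u_t @ A

# Roll the learned map forward auto-regressively from the initial state
u = traj[0]
for _ in range(nt):
    u = u @ A
rollout_err = np.linalg.norm(u - traj[-1]) / np.linalg.norm(traj[-1])
print(f"relative rollout error: {rollout_err:.2e}")
```

A foundation model replaces the least-squares map with a learned network, but the training signal, pairs of consecutive states, and the rollout loop have the same shape.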
Related papers
- PhyRecon: Physically Plausible Neural Scene Reconstruction [81.73129450090684]
We introduce PhyRecon, the first approach to leverage both differentiable rendering and differentiable physics simulation to learn implicit surface representations.
PhyRecon features a novel differentiable particle-based physical simulator built on neural implicit representations.
Our results also exhibit superior physical stability in physical simulators, with at least a 40% improvement across all datasets.
arXiv Detail & Related papers (2024-04-25T15:06:58Z)
- A foundational neural operator that continuously learns without forgetting [1.0878040851638]
We introduce the concept of the Neural Combinatorial Wavelet Neural Operator (NCWNO) as a foundational model for scientific computing.
The NCWNO is specifically designed to excel in learning from a diverse spectrum of physics and continuously adapt to the solution operators associated with parametric partial differential equations (PDEs).
The proposed foundational model offers two key advantages: (i) it can simultaneously learn solution operators for multiple parametric PDEs, and (ii) it can swiftly generalize to new parametric PDEs with minimal fine-tuning.
arXiv Detail & Related papers (2023-10-29T03:20:10Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Learning Controllable Adaptive Simulation for Multi-resolution Physics [86.8993558124143]
We introduce Learning controllable Adaptive simulation for Multi-resolution Physics (LAMP) as the first full deep learning-based surrogate model.
LAMP consists of a Graph Neural Network (GNN) for learning the forward evolution, and a GNN-based actor-critic for learning the policy of spatial refinement and coarsening.
We demonstrate that our LAMP outperforms state-of-the-art deep learning surrogate models, and can adaptively trade-off computation to improve long-term prediction error.
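LAMP learns its refinement and coarsening policy with a GNN-based actor-critic; that model is not reproduced here. Purely as a toy illustration of the refine/coarsen trade-off such a policy controls, a hand-written jump-based heuristic on a 1D grid (hypothetical thresholds, no learning) could look like:

```python
import numpy as np

def adapt_grid(x, f, refine_tol=0.1, coarsen_tol=0.01):
    """One toy refine/coarsen sweep: insert a midpoint where the jump in f
    across an interval is large, drop a point where it is small."""
    y = f(x)
    new_x = [x[0]]
    for i in range(len(x) - 1):
        jump = abs(y[i + 1] - y[i])
        if jump > refine_tol:                       # refine: add a midpoint
            new_x.append(0.5 * (x[i] + x[i + 1]))
        if jump > coarsen_tol or i == len(x) - 2:   # keep the right endpoint
            new_x.append(x[i + 1])
        # otherwise coarsen: skip x[i + 1] entirely
    return np.unique(np.array(new_x))

f = lambda x: np.tanh(20.0 * (x - 0.5))  # solution with a sharp front
x = np.linspace(0.0, 1.0, 17)
for _ in range(3):                        # a few adaptation sweeps
    x = adapt_grid(x, f)
print(len(x), "points, min spacing", np.min(np.diff(x)))
```

After a few sweeps the grid concentrates points near the front at x = 0.5 and thins out in the flat regions; LAMP's contribution is learning when such moves pay off against their computational cost.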
arXiv Detail & Related papers (2023-05-01T23:20:27Z)
- SNUG: Self-Supervised Neural Dynamic Garments [14.83072352654608]
We present a self-supervised method to learn dynamic 3D deformations of garments worn by parametric human bodies.
This allows us to learn models for interactive garments, including dynamic deformations and fine wrinkles, with two orders of magnitude speed up in training time.
arXiv Detail & Related papers (2022-04-05T13:50:21Z)
- Physics-Guided Deep Learning for Dynamical Systems: A Survey [5.733401663293044]
Traditional physics-based models are interpretable but rely on rigid assumptions.
Deep learning provides novel alternatives for efficiently recognizing complex patterns and emulating nonlinear dynamics.
Physics-guided deep learning aims to take the best from both physics-based modeling and state-of-the-art DL models to better solve scientific problems.
arXiv Detail & Related papers (2021-07-02T20:59:03Z)
- PlasticineLab: A Soft-Body Manipulation Benchmark with Differentiable Physics [89.81550748680245]
We introduce a new differentiable physics benchmark called PlasticineLab.
In each task, the agent uses manipulators to deform the plasticine into the desired configuration.
We evaluate several existing reinforcement learning (RL) methods and gradient-based methods on this benchmark.
arXiv Detail & Related papers (2021-04-07T17:59:23Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
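The paper's VAE architecture is not detailed here; as a minimal sketch of the core idea only, the decoder below splits a hypothetical latent code into a physics-grounded part (parameters of an assumed damped-oscillator model) and a free part (a stand-in for the learned correction network):

```python
import numpy as np

t = np.linspace(0.0, 4.0, 200)

def decode(z_phys, z_aux):
    """Physics-integrated decoder: part of the latent space parameterizes a
    known but incomplete physics model, the rest a free correction term."""
    omega, zeta = z_phys                            # interpretable latents
    physics = np.exp(-zeta * t) * np.cos(omega * t)
    correction = z_aux[0] * np.sin(z_aux[1] * t)    # stand-in for a neural net
    return physics + correction

x = decode(z_phys=(6.0, 0.5), z_aux=(0.05, 1.0))
print(x.shape)
```

Because `z_phys` enters only through the physics model, those latent dimensions stay interpretable, while the correction term absorbs whatever the incomplete physics misses.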
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Gradient-Based Training and Pruning of Radial Basis Function Networks with an Application in Materials Physics [0.24792948967354234]
We propose a gradient-based technique for training radial basis function networks with an efficient and scalable open-source implementation.
We derive novel closed-form optimization criteria for pruning the models for continuous as well as binary data.
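The cited open-source implementation and closed-form pruning criteria are not shown here; as a minimal sketch under assumed hyperparameters (fixed centers, shared width, plain gradient descent, and simple magnitude-based pruning as a stand-in for the paper's criteria):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 100)
y = np.sin(x)                                  # target function

centers = np.linspace(-3.0, 3.0, 12)           # fixed RBF centers
gamma = 2.0                                    # shared width parameter
Phi = np.exp(-gamma * (x[:, None] - centers[None, :]) ** 2)

w = rng.standard_normal(len(centers)) * 0.1
lr = 0.5
for _ in range(2000):                          # gradient descent on MSE
    err = Phi @ w - y
    w -= lr * Phi.T @ err / len(x)

mse = np.mean((Phi @ w - y) ** 2)
print(f"final MSE: {mse:.2e}")

# Magnitude-based pruning (a simple stand-in for the paper's
# closed-form criteria): drop centers with small weights
keep = np.abs(w) > 0.05
print(f"kept {keep.sum()} of {len(w)} centers")
```

A full implementation would also train the centers and widths by gradient descent; here only the output weights are learned, which keeps the loss quadratic.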
arXiv Detail & Related papers (2020-04-06T11:32:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.