Towards a Physics Foundation Model
- URL: http://arxiv.org/abs/2509.13805v2
- Date: Fri, 26 Sep 2025 05:59:55 GMT
- Title: Towards a Physics Foundation Model
- Authors: Florian Wiesner, Matthias Wessling, Stephen Baek
- Abstract summary: We present the General Physics Transformer (GPhyT), trained on 1.8 TB of diverse simulation data. GPhyT achieves superior performance across multiple physics domains, outperforming specialized architectures by up to 29x. By establishing that a single model can learn general physical principles from data alone, this work opens the path toward a universal Physics Foundation Model.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Foundation models have revolutionized natural language processing through a "train once, deploy anywhere" paradigm, where a single pre-trained model adapts to countless downstream tasks without retraining. Access to a Physics Foundation Model (PFM) would be transformative -- democratizing access to high-fidelity simulations, accelerating scientific discovery, and eliminating the need for specialized solver development. Yet current physics-aware machine learning approaches remain fundamentally limited to single, narrow domains and require retraining for each new system. We present the General Physics Transformer (GPhyT), trained on 1.8 TB of diverse simulation data, that demonstrates foundation model capabilities are achievable for physics. Our key insight is that transformers can learn to infer governing dynamics from context, enabling a single model to simulate fluid-solid interactions, shock waves, thermal convection, and multi-phase dynamics without being told the underlying equations. GPhyT achieves three critical breakthroughs: (1) superior performance across multiple physics domains, outperforming specialized architectures by up to 29x, (2) zero-shot generalization to entirely unseen physical systems through in-context learning, and (3) stable long-term predictions through 50-timestep rollouts. By establishing that a single model can learn generalizable physical principles from data alone, this work opens the path toward a universal PFM that could transform computational science and engineering.
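The abstract's two key mechanisms -- inferring dynamics from a context window and multi-step autoregressive rollouts -- can be illustrated with a minimal sketch. This is a toy stand-in, not GPhyT itself: `predict_next` is a placeholder linear extrapolator where the paper uses a trained transformer, and the field shapes are assumed for illustration.

```python
import numpy as np

def predict_next(context: np.ndarray) -> np.ndarray:
    """Stand-in for a learned model: given a (T, H, W) window of prior
    field snapshots, predict the next snapshot. Here a simple linear
    extrapolation in time replaces the trained transformer."""
    return 2 * context[-1] - context[-2]

def rollout(context: np.ndarray, steps: int) -> np.ndarray:
    """Autoregressive rollout: repeatedly predict the next frame and
    slide the context window forward, feeding predictions back in --
    the setup behind multi-timestep evaluation."""
    frames = list(context)
    for _ in range(steps):
        window = np.stack(frames[-len(context):])
        frames.append(predict_next(window))
    return np.stack(frames[len(context):])

# Toy field that evolves linearly in time, so the extrapolator is exact.
t = np.arange(4).reshape(-1, 1, 1)
history = np.ones((4, 8, 8)) * t       # context frames 0..3
future = rollout(history, steps=3)     # predicted frames 4, 5, 6
```

The point of the sketch is the interface: the model sees only a window of prior states and must infer the governing dynamics from that context, with no equations supplied.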
Related papers
- GeoPT: Scaling Physics Simulation via Lifted Geometric Pre-Training [86.70824679370524]
We present GeoPT, a unified pre-trained model for general physics simulation based on lifted geometric pre-training. The core idea is to augment geometry with synthetic dynamics, enabling dynamics-aware self-supervision without physics labels.
arXiv Detail & Related papers (2026-02-23T22:32:08Z) - PhysRVG: Physics-Aware Unified Reinforcement Learning for Video Generative Models [100.65199317765608]
Physical principles are fundamental to realistic visual simulation, but remain a significant oversight in transformer-based video generation. We introduce a physics-aware reinforcement learning paradigm for video generation models that enforces physical collision rules directly in high-dimensional spaces. We extend this paradigm to a unified framework, termed Mimicry-Discovery Cycle (MDcycle), which allows substantial fine-tuning.
arXiv Detail & Related papers (2026-01-16T08:40:10Z) - NeuralOGCM: Differentiable Ocean Modeling with Learnable Physics [38.88216084180426]
We propose NeuralOGCM, an ocean modeling framework that fuses differentiable programming with deep learning. Its learnable physics integration captures large-scale, deterministic physical evolution and turns key physical parameters into learnable ones. A deep neural network learns to correct for subgrid-scale processes and discretization errors not captured by the physics model. Experiments demonstrate that NeuralOGCM maintains long-term stability and physical consistency, significantly outperforming traditional numerical models in speed and pure-AI baselines in accuracy.
arXiv Detail & Related papers (2025-12-12T12:53:46Z) - Universal Physics Simulation: A Foundational Diffusion Approach [0.0]
We present the first foundational AI model for universal physics simulation that learns physical laws directly from boundary-condition data. Our sketch-guided diffusion transformer approach reimagines computational physics by treating simulation as a conditional generation problem. Unlike sequential time-stepping methods that accumulate errors over iterations, our approach bypasses temporal integration entirely.
arXiv Detail & Related papers (2025-07-13T18:12:34Z) - OmniFluids: Physics Pre-trained Modeling of Fluid Dynamics [25.066485418709114]
We propose OmniFluids, a pure physics pre-trained model that captures fundamental laws of fluid dynamics and adapts efficiently to diverse downstream tasks. We develop a training framework combining physics-only pre-training, coarse-grid operator distillation, and few-shot fine-tuning. Tests show that OmniFluids outperforms state-of-the-art AI-driven methods in flow-field prediction and statistics.
arXiv Detail & Related papers (2025-06-12T16:23:02Z) - FORT: Forward-Only Regression Training of Normalizing Flows [85.66894616735752]
We revisit classical normalizing flows as one-step generative models with exact likelihoods. We propose a novel, scalable training objective that does not require computing the expensive change-of-variables formula used in conventional maximum-likelihood training.
arXiv Detail & Related papers (2025-06-01T20:32:27Z) - Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models [2.8720819157502344]
Physics-Informed Machine Learning has emerged as a popular approach for modeling and simulation in digital twins. This paper presents a generic approach based on a novel physics-encoded residual neural network architecture. Our method integrates differentiable physics blocks, which implement mathematical operators from physics-based models, with feed-forward learning blocks.
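The residual pattern described here -- a fixed, differentiable physics operator carrying the bulk of the dynamics, with a small learned map correcting the remainder -- can be sketched in a few lines. The heat-equation operator and the tiny `tanh` correction below are illustrative assumptions, not the paper's actual blocks.

```python
import numpy as np

rng = np.random.default_rng(0)

def physics_block(x, dt=0.1):
    """Encoded physics operator: one explicit-Euler step of a 1-D heat
    equation on a periodic grid (discrete Laplacian via np.roll)."""
    lap = np.roll(x, -1) - 2 * x + np.roll(x, 1)
    return x + dt * lap

W = rng.normal(scale=0.01, size=(16, 16))  # toy "learned" weights

def residual_step(x):
    """Physics-encoded residual block: the known operator plus a small
    learned correction for whatever the physics model misses."""
    return physics_block(x) + np.tanh(x @ W)

x = rng.normal(size=16)
y = residual_step(x)
```

The design choice being illustrated: the learned part only has to model the residual between the incomplete physics and reality, which is typically a much easier target than the full dynamics.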
arXiv Detail & Related papers (2024-11-18T11:58:20Z) - PhyRecon: Physically Plausible Neural Scene Reconstruction [81.73129450090684]
We introduce PHYRECON, the first approach to leverage both differentiable rendering and differentiable physics simulation to learn implicit surface representations.
Central to this design is an efficient transformation between SDF-based implicit representations and explicit surface points.
Our results also exhibit superior physical stability in physical simulators, with at least a 40% improvement across all datasets.
arXiv Detail & Related papers (2024-04-25T15:06:58Z) - OmniArch: Building Foundation Model For Scientific Computing [35.41293100957156]
We present OmniArch, the first prototype aimed at solving multi-scale and multi-physics scientific computing problems with physical alignment. To our knowledge, we are the first to conduct unified 1D-2D-3D pre-training on PDEBench; the resulting model not only sets new performance benchmarks for 1D, 2D, and 3D PDEs but also demonstrates exceptional adaptability to new physics via in-context and zero-shot learning.
arXiv Detail & Related papers (2024-02-25T07:19:01Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - PlasticineLab: A Soft-Body Manipulation Benchmark with Differentiable Physics [89.81550748680245]
We introduce a new differentiable physics benchmark called PlasticineLab.
In each task, the agent uses manipulators to deform the plasticine into the desired configuration.
We evaluate several existing reinforcement learning (RL) methods and gradient-based methods on this benchmark.
arXiv Detail & Related papers (2021-04-07T17:59:23Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
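The idea of grounding part of the latent space in physics can be sketched as a decoder that splits its latent vector into a physics-interpreted block and a free block. Everything concrete below -- the damped-oscillator model, the split point, the linear auxiliary map -- is an illustrative assumption, not the architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def physics_decoder(z_phys, t):
    """Physics-grounded decoder branch: an incomplete physics model.
    Here the two physics latents are (amplitude, decay rate) of a
    damped oscillation."""
    a, k = z_phys
    return a * np.exp(-k * t) * np.cos(t)

W = rng.normal(scale=0.05, size=(3, 50))  # toy "learned" auxiliary map

def decode(z, t):
    """Decoder mixing a physics-grounded latent block z[:2] with a free
    latent block z[2:] handled by a small learned map, so the physics
    latents stay interpretable while the free block absorbs the rest."""
    return physics_decoder(z[:2], t) + z[2:] @ W

t = np.linspace(0, 5, 50)
x = decode(np.array([1.0, 0.3, 0.1, -0.2, 0.05]), t)
```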
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
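The particles-as-nodes, learned-message-passing scheme that GNS describes can be sketched in miniature. This is a toy stand-in under stated assumptions: a single hand-wired edge list, one message round, and a `tanh` of a linear map where the real model uses trained MLPs for messages and node updates.

```python
import numpy as np

def message_passing_step(pos, edges, W):
    """One message-passing round over a particle graph: each edge emits
    a message from the relative displacement of its endpoints, each
    receiver sums its incoming messages, and nodes update residually."""
    agg = np.zeros_like(pos)
    for s, r in edges:                       # (sender, receiver) pairs
        msg = np.tanh((pos[s] - pos[r]) @ W)  # toy learned message fn
        agg[r] += msg                         # permutation-invariant sum
    return pos + agg                          # residual node update

rng = np.random.default_rng(1)
pos = rng.normal(size=(5, 2))                 # 5 particles in 2-D
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
W = rng.normal(scale=0.1, size=(2, 2))
new_pos = message_passing_step(pos, edges, W)
```

Because messages depend only on relative displacements and are summed per node, the same learned functions apply to any particle count or connectivity -- the property behind GNS's generalization to far larger systems at test time.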
This list is automatically generated from the titles and abstracts of the papers in this site.