NeuralOGCM: Differentiable Ocean Modeling with Learnable Physics
- URL: http://arxiv.org/abs/2512.11525v1
- Date: Fri, 12 Dec 2025 12:53:46 GMT
- Title: NeuralOGCM: Differentiable Ocean Modeling with Learnable Physics
- Authors: Hao Wu, Yuan Gao, Fan Xu, Fan Zhang, Guangliang Liu, Yuxuan Liang, Xiaomeng Huang
- Abstract summary: We propose NeuralOGCM, an ocean modeling framework that fuses differentiable programming with deep learning. The learnable physics integration captures large-scale, deterministic physical evolution, and transforms key physical parameters into learnable parameters. A deep neural network learns to correct for subgrid-scale processes and discretization errors not captured by the physics model. Experiments demonstrate that NeuralOGCM maintains long-term stability and physical consistency, significantly outperforming traditional numerical models in speed and pure AI baselines in accuracy.
- Score: 38.88216084180426
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: High-precision scientific simulation faces a long-standing trade-off between computational efficiency and physical fidelity. To address this challenge, we propose NeuralOGCM, an ocean modeling framework that fuses differentiable programming with deep learning. At the core of NeuralOGCM is a fully differentiable dynamical solver, which leverages physics knowledge as its core inductive bias. The learnable physics integration captures large-scale, deterministic physical evolution, and transforms key physical parameters (e.g., diffusion coefficients) into learnable parameters, enabling the model to autonomously optimize its physical core via end-to-end training. Concurrently, a deep neural network learns to correct for subgrid-scale processes and discretization errors not captured by the physics model. Both components work in synergy, with their outputs integrated by a unified ODE solver. Experiments demonstrate that NeuralOGCM maintains long-term stability and physical consistency, significantly outperforming traditional numerical models in speed and pure AI baselines in accuracy. Our work paves a new path for building fast, stable, and physically-plausible models for scientific computing.
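The abstract describes the core coupling clearly enough to sketch: a differentiable physics step with a learnable coefficient, a neural correction for what the physics misses, and a single explicit integrator over their summed tendencies. The following PyTorch toy is a minimal sketch under those assumptions; the 1-D diffusion setup, the Euler integrator, and all names (HybridStep, log_kappa, etc.) are illustrative and not NeuralOGCM's actual code.

```python
import torch
import torch.nn as nn

class HybridStep(nn.Module):
    """Toy hybrid step in the spirit of NeuralOGCM: a differentiable
    finite-difference diffusion core with a learnable coefficient, plus a
    small network that corrects unresolved tendencies (illustrative only)."""

    def __init__(self, n_grid: int, dx: float):
        super().__init__()
        self.dx = dx
        # Physical parameter exposed to the optimizer (log-space keeps it positive).
        self.log_kappa = nn.Parameter(torch.log(torch.tensor(1e-3)))
        # Learned correction for subgrid processes / discretization error.
        self.correction = nn.Sequential(
            nn.Linear(n_grid, 64), nn.Tanh(), nn.Linear(64, n_grid)
        )

    def physics_tendency(self, u: torch.Tensor) -> torch.Tensor:
        # Second-order centered Laplacian with periodic boundaries.
        lap = (torch.roll(u, -1, -1) - 2 * u + torch.roll(u, 1, -1)) / self.dx**2
        return torch.exp(self.log_kappa) * lap

    def forward(self, u: torch.Tensor, dt: float) -> torch.Tensor:
        # Unified explicit-Euler update over the summed tendencies.
        tendency = self.physics_tendency(u) + self.correction(u)
        return u + dt * tendency

# End-to-end training against reference trajectories tunes both the diffusion
# coefficient and the correction network through the solver step.
model = HybridStep(n_grid=128, dx=1.0 / 128)
u0 = torch.randn(4, 128)        # batch of initial states
target = torch.randn(4, 128)    # stand-in for reference data
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
pred = model(u0, dt=1e-3)
loss = torch.mean((pred - target) ** 2)
loss.backward()
optim.step()
```

Because the whole step is differentiable, a single backward pass updates the physical coefficient and the correction network together, which is the end-to-end optimization of the physical core the abstract refers to.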
Related papers
- Towards a Physics Foundation Model [2.109902626434734]
We present the General Physics Transformer (GPhyT), trained on 1.8 TB of diverse simulation data. GPhyT achieves superior performance across multiple physics domains, outperforming specialized architectures by up to 29x. By establishing that a single model can learn general physical principles from data alone, this work opens the path toward a universal Physics Foundation Model.
arXiv Detail & Related papers (2025-09-17T08:19:57Z)
- OmniFluids: Physics Pre-trained Modeling of Fluid Dynamics [25.066485418709114]
We propose OmniFluids, a pure physics pre-trained model that captures fundamental fluid dynamics laws and adapts efficiently to diverse downstream tasks. We develop a training framework combining physics-only pre-training, coarse-grid operator distillation, and few-shot fine-tuning. Tests show that OmniFluids outperforms state-of-the-art AI-driven methods in terms of flow field prediction and statistics.
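The three training stages map onto a simple loop structure. The sketch below is a hedged illustration only: the 1-D Poisson residual, the average-pooling coarsening, and the tiny MLPs stand in for the fluid-dynamics operators and architectures actually used by OmniFluids.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def poisson_residual(u, f):
    # Physics-only loss for a periodic 1-D Poisson problem u_xx = f
    # (an illustrative stand-in for the fluid-dynamics operators in the paper).
    lap = torch.roll(u, -1, -1) - 2.0 * u + torch.roll(u, 1, -1)
    return (lap - f).pow(2).mean()

def coarsen(x):
    # Restrict a fine 1-D field to half resolution by local averaging.
    return F.avg_pool1d(x.unsqueeze(1), 2).squeeze(1)

teacher = nn.Sequential(nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 64))
student = nn.Sequential(nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 32))
t_opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
s_opt = torch.optim.Adam(student.parameters(), lr=1e-3)

# Stage 1: physics-only pre-training -- no labelled solutions, just the residual.
for _ in range(200):
    f = torch.randn(8, 64)
    f = f - f.mean(dim=-1, keepdim=True)   # zero-mean forcing for the periodic problem
    loss = poisson_residual(teacher(f), f)
    t_opt.zero_grad(); loss.backward(); t_opt.step()

# Stage 2: coarse-grid operator distillation -- the student mimics the teacher on a coarser grid.
for _ in range(200):
    f = torch.randn(8, 64)
    with torch.no_grad():
        target = coarsen(teacher(f))
    loss = (student(coarsen(f)) - target).pow(2).mean()
    s_opt.zero_grad(); loss.backward(); s_opt.step()

# Stage 3: few-shot fine-tuning on a handful of labelled pairs from the downstream task.
f_few, u_few = torch.randn(4, 32), torch.randn(4, 32)
for _ in range(50):
    loss = (student(f_few) - u_few).pow(2).mean()
    s_opt.zero_grad(); loss.backward(); s_opt.step()
```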
arXiv Detail & Related papers (2025-06-12T16:23:02Z)
- Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models [2.8720819157502344]
Physics Informed Machine Learning has emerged as a popular approach for modeling and simulation in digital twins. This paper presents a generic approach based on a novel physics-encoded residual neural network architecture. Our method integrates differentiable physics blocks, implementing mathematical operators from physics-based models, with feed-forward learning blocks.
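As a rough illustration of the architecture described here, the block below combines a fixed, differentiable physics operator and a learned feed-forward branch inside one residual update; the damped-oscillator operator and all sizes are assumptions, not the paper's model.

```python
import torch
import torch.nn as nn

class PhysicsEncodedBlock(nn.Module):
    """Sketch of a residual block mixing a differentiable physics operator
    with a feed-forward learning block (structure assumed, not the paper's code)."""

    def __init__(self, dim: int, dt: float = 0.1):
        super().__init__()
        self.dt = dt
        self.learned = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

    def physics(self, x: torch.Tensor) -> torch.Tensor:
        # Example operator: damped harmonic motion d[pos, vel]/dt = [vel, -pos - 0.1*vel],
        # applied feature-wise purely for illustration.
        pos, vel = x.chunk(2, dim=-1)
        return torch.cat([vel, -pos - 0.1 * vel], dim=-1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual update: known physics plus a learned correction.
        return x + self.dt * (self.physics(x) + self.learned(x))

block = PhysicsEncodedBlock(dim=16)
y = block(torch.randn(8, 16))   # (batch, features); first/second halves = position/velocity
```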
arXiv Detail & Related papers (2024-11-18T11:58:20Z)
- Transport-Embedded Neural Architecture: Redefining the Landscape of physics aware neural models in fluid mechanics [0.0]
A physical problem, the Taylor-Green vortex, defined on a bi-periodic domain, is used as a benchmark to evaluate the performance of both the standard physics-informed neural network and our model.
Results show that while the standard physics-informed neural network fails to predict the solution accurately and merely returns the initial condition for the entire time span, our model successfully captures the temporal changes in the physics.
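For reference, the 2-D Taylor-Green vortex on a bi-periodic domain has a closed-form, exponentially decaying solution, which is what makes it a convenient benchmark. A NumPy sketch of that reference solution follows; the grid size and viscosity are arbitrary choices, not values from the paper.

```python
import numpy as np

def taylor_green(nx=64, ny=64, nu=0.01, t=1.0):
    """Closed-form 2-D Taylor-Green vortex on [0, 2*pi]^2 with viscosity nu.
    Each velocity component decays as exp(-2*nu*t), which is the temporal
    behaviour a physics-aware model is expected to reproduce."""
    x = np.linspace(0.0, 2.0 * np.pi, nx, endpoint=False)
    y = np.linspace(0.0, 2.0 * np.pi, ny, endpoint=False)
    X, Y = np.meshgrid(x, y, indexing="ij")
    decay = np.exp(-2.0 * nu * t)
    u = np.cos(X) * np.sin(Y) * decay                         # x-velocity
    v = -np.sin(X) * np.cos(Y) * decay                        # y-velocity
    p = -0.25 * (np.cos(2 * X) + np.cos(2 * Y)) * decay**2    # pressure
    return u, v, p

u, v, p = taylor_green(t=0.5)
```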
arXiv Detail & Related papers (2024-10-05T10:32:51Z)
- Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose an extension of Latent Dynamics Networks (LDNets) to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
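A generic latent-dynamics surrogate of this flavor can be sketched as follows: a small network advances a latent state in time directly from the input parameters, and a decoder evaluates the field at query points, so no numerical time-stepper is needed. The liquid (LTC-style) cell and Fourier embeddings of LFLDNets are omitted here; everything below is an illustrative stand-in with made-up names and sizes.

```python
import torch
import torch.nn as nn

class LatentSurrogate(nn.Module):
    """Sketch of a latent-dynamics surrogate: latent state advanced in time from
    input parameters, decoded pointwise at query coordinates (illustrative only)."""

    def __init__(self, n_params=3, n_latent=8, n_coord=3):
        super().__init__()
        self.n_latent = n_latent
        self.dynamics = nn.Sequential(nn.Linear(n_latent + n_params, 32), nn.Tanh(),
                                      nn.Linear(32, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent + n_coord, 32), nn.Tanh(),
                                     nn.Linear(32, 1))

    def forward(self, params, coords, n_steps=10, dt=0.01):
        z = torch.zeros(params.shape[0], self.n_latent)
        outputs = []
        for _ in range(n_steps):
            # Explicit latent time step driven by the input parameters.
            z = z + dt * self.dynamics(torch.cat([z, params], dim=-1))
            z_rep = z.unsqueeze(1).expand(-1, coords.shape[1], -1)
            outputs.append(self.decoder(torch.cat([z_rep, coords], dim=-1)))
        return torch.stack(outputs, dim=1)   # (batch, time, points, 1)

model = LatentSurrogate()
out = model(torch.randn(2, 3), torch.randn(2, 50, 3))
```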
arXiv Detail & Related papers (2024-08-19T09:14:25Z)
- DiffuseBot: Breeding Soft Robots With Physics-Augmented Generative Diffusion Models [102.13968267347553]
We present DiffuseBot, a physics-augmented diffusion model that generates soft robot morphologies capable of excelling in a wide spectrum of tasks.
We showcase a range of simulated and fabricated robots along with their capabilities.
arXiv Detail & Related papers (2023-11-28T18:58:48Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
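The spatial half of the decomposition can be pictured as interleaved striding: a fine field is split into several coarser subfields that can each be advanced by a cheaper solver and then reassembled. The NumPy sketch below shows only that spatial splitting (NeuralStagger also staggers in time), with function names of my own choosing.

```python
import numpy as np

def stagger_decompose(field, s=2):
    """Split a fine-resolution field into s*s interleaved coarse subfields."""
    return [field[i::s, j::s] for i in range(s) for j in range(s)]

def stagger_reassemble(pieces, s=2):
    """Inverse of stagger_decompose: interleave the subfields back onto the fine grid."""
    h, w = pieces[0].shape
    out = np.empty((h * s, w * s), dtype=pieces[0].dtype)
    k = 0
    for i in range(s):
        for j in range(s):
            out[i::s, j::s] = pieces[k]
            k += 1
    return out

fine = np.random.rand(64, 64)
pieces = stagger_decompose(fine, s=2)            # four 32x32 subfields
assert np.allclose(stagger_reassemble(pieces, s=2), fine)
```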
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Data-driven modeling of Landau damping by physics-informed neural networks [4.728411962159049]
We construct a multi-moment fluid model with an implicit fluid closure included in the neural network using machine learning.
The model reproduces the time evolution of the electric field energy, including its damping rate, and the plasma dynamics from the kinetic simulations.
This work sheds light on the accurate and efficient modeling of large-scale systems, which can be extended to complex multiscale laboratory, space, and astrophysical plasma physics problems.
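One way to picture an implicit, learned closure is a small network that supplies the unclosed term of the highest moment equation. The toy below plugs such a network into a 1-D, three-moment pressure equation; the specific form, variable names, and closure inputs are assumptions made for illustration, not the paper's model.

```python
import torch
import torch.nn as nn

# Hypothetical closure network: maps local moments (n, u, p) and the electric
# field to the unclosed heat-flux term in the pressure-moment equation.
closure_net = nn.Sequential(nn.Linear(4, 32), nn.Tanh(), nn.Linear(32, 1))

def pressure_tendency(n, u, p, E, dx=1.0, gamma=3.0):
    """Toy 1-D three-moment pressure equation: advection and compression terms
    from standard fluid theory, plus a learned term standing in for the
    divergence of the heat flux (the part with no exact fluid closure)."""
    dpdx = (torch.roll(p, -1, -1) - torch.roll(p, 1, -1)) / (2.0 * dx)
    dudx = (torch.roll(u, -1, -1) - torch.roll(u, 1, -1)) / (2.0 * dx)
    learned_div_q = closure_net(torch.stack([n, u, p, E], dim=-1)).squeeze(-1)
    return -u * dpdx - gamma * p * dudx - learned_div_q

# Usage on placeholder fields of shape (batch, n_grid):
n = torch.ones(2, 64); u = torch.zeros(2, 64); p = torch.ones(2, 64); E = torch.zeros(2, 64)
dp_dt = pressure_tendency(n, u, p, E)
```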
arXiv Detail & Related papers (2022-11-02T10:33:38Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
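A generic reading of the DCT-plus-RNN idea: encode the spatial field with a truncated 2-D DCT, evolve the retained coefficients with a recurrent cell, and decode by inverse DCT. The sketch below uses SciPy's DCT and a GRU cell as stand-ins; the mode count, rollout length, and the GRU choice are assumptions, not the paper's architecture.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.fft import dctn, idctn

K = 8                                        # number of retained DCT modes per axis
field = np.random.rand(64, 64)               # initial spatial field
coeffs = dctn(field, norm="ortho")[:K, :K]   # low-frequency spectral encoding

gru = nn.GRUCell(input_size=K * K, hidden_size=K * K)
z = torch.zeros(1, K * K)
x = torch.tensor(coeffs.reshape(1, -1), dtype=torch.float32)

for _ in range(10):                          # roll the latent spectral state forward in time
    z = gru(x, z)
    x = z

# Decode: place the evolved coefficients back into a full spectrum and invert the DCT.
full = np.zeros((64, 64))
full[:K, :K] = z.detach().numpy().reshape(K, K)
prediction = idctn(full, norm="ortho")
```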
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- PlasticineLab: A Soft-Body Manipulation Benchmark with Differentiable Physics [89.81550748680245]
We introduce a new differentiable physics benchmark called PlasticineLab.
In each task, the agent uses manipulators to deform the plasticine into the desired configuration.
We evaluate several existing reinforcement learning (RL) methods and gradient-based methods on this benchmark.
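The gradient-based baselines that a differentiable simulator enables amount to optimizing actions by backpropagating the task loss through the dynamics. The toy below does this for a point mass rather than soft-body plasticine, purely to show the pattern; all names and constants are illustrative.

```python
import torch

def step(state, action, dt=0.05):
    # Differentiable toy dynamics: a point mass accelerated by the action.
    pos, vel = state
    vel = vel + dt * action
    pos = pos + dt * vel
    return pos, vel

T = 40
actions = torch.zeros(T, 2, requires_grad=True)   # open-loop action sequence to optimize
target = torch.tensor([1.0, 0.5])
opt = torch.optim.Adam([actions], lr=0.1)

for it in range(200):
    state = (torch.zeros(2), torch.zeros(2))
    for t in range(T):
        state = step(state, actions[t])            # roll out through the differentiable sim
    loss = torch.sum((state[0] - target) ** 2)     # final-position task loss
    opt.zero_grad(); loss.backward(); opt.step()
```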
arXiv Detail & Related papers (2021-04-07T17:59:23Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
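The idea of grounding part of the latent space in physics can be sketched as a VAE whose first latent coordinates are decoded through a known (incomplete) physics model, with the remaining coordinates handled by a neural decoder. The oscillator physics, sizes, and names below are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class PhysicsGroundedVAE(nn.Module):
    """Sketch: the first latent coordinates act as physical parameters decoded
    by a known physics model; the rest feed a neural decoder that accounts for
    what the physics misses (illustrative only)."""

    def __init__(self, n_obs=50, n_phys=1, n_nn=4):
        super().__init__()
        self.n_phys = n_phys
        self.encoder = nn.Sequential(nn.Linear(n_obs, 64), nn.Tanh(),
                                     nn.Linear(64, 2 * (n_phys + n_nn)))
        self.nn_decoder = nn.Sequential(nn.Linear(n_nn, 64), nn.Tanh(), nn.Linear(64, n_obs))
        self.t = torch.linspace(0.0, 5.0, n_obs)

    def physics_decoder(self, omega):
        # Incomplete physics: an undamped oscillator with latent frequency omega.
        return torch.cos(omega * self.t.unsqueeze(0))

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization trick
        z_phys, z_nn = z[:, :self.n_phys], z[:, self.n_phys:]
        recon = self.physics_decoder(z_phys) + self.nn_decoder(z_nn)
        return recon, mu, logvar

vae = PhysicsGroundedVAE()
recon, mu, logvar = vae(torch.randn(8, 50))
```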
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.