Equilibrium Conserving Neural Operators for Super-Resolution Learning
- URL: http://arxiv.org/abs/2504.13422v1
- Date: Fri, 18 Apr 2025 02:47:53 GMT
- Title: Equilibrium Conserving Neural Operators for Super-Resolution Learning
- Authors: Vivek Oommen, Andreas E. Robertson, Daniel Diaz, Coleman Alleman, Zhen Zhang, Anthony D. Rollett, George E. Karniadakis, Rémi Dingreville
- Abstract summary: We introduce a framework for super-resolution learning in solid mechanics problems. Our approach allows one to train a high-resolution neural network using only low-resolution data. We evaluate this ECO-based super-resolution framework, which strongly enforces conservation laws in the predicted solutions.
- Score: 2.062348453578637
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural surrogate solvers can estimate solutions to partial differential equations in physical problems more efficiently than standard numerical methods, but require extensive high-resolution training data. In this paper, we break this limitation; we introduce a framework for super-resolution learning in solid mechanics problems. Our approach allows one to train a high-resolution neural network using only low-resolution data. Our Equilibrium Conserving Operator (ECO) architecture embeds known physics directly into the network to make up for missing high-resolution information during training. We evaluate this ECO-based super-resolution framework, which strongly enforces conservation laws in the predicted solutions, on two working examples: embedded pores in a homogenized matrix and randomly textured polycrystalline materials. ECO eliminates the reliance on high-fidelity data and reduces the upfront cost of data collection by two orders of magnitude, offering a robust pathway for resource-efficient surrogate modeling in materials science. ECO is readily generalizable to other physics-based problems.
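The core idea, training a high-resolution operator against low-resolution labels while requiring the high-resolution output to satisfy mechanical equilibrium, can be illustrated with a minimal sketch. This is not the authors' implementation: the paper enforces equilibrium strongly through the ECO architecture, whereas the sketch below substitutes a soft finite-difference penalty on div(σ) = 0; the network `model`, the average-pooling restriction to low resolution, and the weight `lam` are all assumptions made for illustration.

```python
# Hedged sketch (assumed formulation, not the ECO architecture): train a
# high-resolution stress predictor using only low-resolution labels, with a
# soft equilibrium residual div(sigma) = 0 evaluated on the high-res output.
import torch
import torch.nn.functional as F

def equilibrium_residual(stress, h):
    """Mean squared central-difference divergence of a 2D stress field.

    stress: (B, 3, H, W) tensor holding [sigma_xx, sigma_yy, sigma_xy]
    h: grid spacing of the high-resolution mesh
    """
    sxx, syy, sxy = stress[:, 0], stress[:, 1], stress[:, 2]
    dsxx_dx = (sxx[:, :, 2:] - sxx[:, :, :-2]) / (2 * h)
    dsxy_dy = (sxy[:, 2:, :] - sxy[:, :-2, :]) / (2 * h)
    dsyy_dy = (syy[:, 2:, :] - syy[:, :-2, :]) / (2 * h)
    dsxy_dx = (sxy[:, :, 2:] - sxy[:, :, :-2]) / (2 * h)
    rx = dsxx_dx[:, 1:-1, :] + dsxy_dy[:, :, 1:-1]   # x-momentum balance
    ry = dsxy_dx[:, 1:-1, :] + dsyy_dy[:, :, 1:-1]   # y-momentum balance
    return (rx ** 2).mean() + (ry ** 2).mean()

def training_step(model, micro, stress_lr, optimizer, scale=4, h=1.0, lam=1.0):
    """One step: fit the downsampled prediction to low-res labels plus an equilibrium penalty."""
    optimizer.zero_grad()
    stress_hr = model(micro)                     # (B, 3, H, W) high-resolution prediction
    stress_ds = F.avg_pool2d(stress_hr, scale)   # compare against labels at low resolution only
    loss = F.mse_loss(stress_ds, stress_lr) + lam * equilibrium_residual(stress_hr, h)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, the equilibrium constraint supplies the fine-scale information that the low-resolution labels alone cannot provide.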
Related papers
- Paving the way for scientific foundation models: enhancing generalization and robustness in PDEs with constraint-aware pre-training [49.8035317670223]
A scientific foundation model (SciFM) is emerging as a promising tool for learning transferable representations across diverse domains. We propose incorporating PDE residuals into pre-training, either as the sole learning signal or in combination with data loss, to compensate for limited or infeasible training data. Our results show that pre-training with PDE constraints significantly enhances generalization, outperforming models trained solely on solution data.
arXiv Detail & Related papers (2025-03-24T19:12:39Z) - Solving Differential Equations with Constrained Learning [8.522558872274276]
(Partial) differential equations (PDEs) are fundamental tools for describing natural phenomena, making their solution crucial in science and engineering.
Traditional methods, such as the finite element method, provide reliable solutions, but their accuracy is tied to the use of computationally intensive fine meshes.
This paper addresses these challenges by developing a science-constrained learning (SCL) framework.
It demonstrates that finding a (weak) solution of a PDE is equivalent to solving a constrained learning problem with worst-case losses.
arXiv Detail & Related papers (2024-10-30T08:20:39Z) - OTClean: Data Cleaning for Conditional Independence Violations using
Optimal Transport [51.6416022358349]
OTClean is a framework that harnesses optimal transport theory for data repair under Conditional Independence (CI) constraints.
We develop an iterative algorithm inspired by Sinkhorn's matrix scaling algorithm, which efficiently handles high-dimensional and large-scale data (a minimal sketch of the Sinkhorn iteration itself appears after this list).
arXiv Detail & Related papers (2024-03-04T18:23:55Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Physics-aware deep learning framework for linear elasticity [0.0]
The paper presents an efficient and robust data-driven deep learning (DL) computational framework for linear continuum elasticity problems.
For an accurate representation of the field variables, a multi-objective loss function is proposed.
Several benchmark problems, including the Airy solution to elasticity and the Kirchhoff-Love plate problem, are solved.
arXiv Detail & Related papers (2023-02-19T20:33:32Z) - AttNS: Attention-Inspired Numerical Solving For Limited Data Scenarios [51.94807626839365]
We propose the attention-inspired numerical solver (AttNS) to solve differential equations in limited-data scenarios. AttNS is inspired by the effectiveness of attention modules in Residual Neural Networks (ResNet) in enhancing model generalization and robustness.
arXiv Detail & Related papers (2023-02-05T01:39:21Z) - Spatio-Temporal Super-Resolution of Dynamical Systems using
Physics-Informed Deep-Learning [0.0]
We propose a physics-informed deep learning-based framework to enhance spatial and temporal resolution of PDE solutions.
The framework consists of two trainable modules that independently super-resolve (in both space and time) the PDE solutions.
The proposed framework is well-suited for integration with traditional numerical methods to reduce computational complexity during engineering design.
arXiv Detail & Related papers (2022-12-08T18:30:18Z) - PhySRNet: Physics informed super-resolution network for application in
computational solid mechanics [0.0]
This work aims at developing a physics-informed deep learning-based super-resolution framework (PhySRNet).
It enables reconstruction of high-resolution deformation fields from their low-resolution counterparts without requiring high-resolution labeled data.
arXiv Detail & Related papers (2022-06-30T17:51:50Z) - Machine Learning-Accelerated Computational Solid Mechanics: Application
to Linear Elasticity [0.0]
We leverage the governing equations and boundary conditions of the physical system to train the model without using any high-resolution labeled data.
We demonstrate that the super-resolved fields match the accuracy of an advanced numerical solver running at 400 times the coarse mesh resolution.
arXiv Detail & Related papers (2021-12-16T07:39:50Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics-informed neural networks have been successfully applied to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics-informed neural networks have difficulty resolving localized effects and strongly nonlinear solutions via optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement, and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Automatically Learning Compact Quality-aware Surrogates for Optimization
Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in a complex training pipeline yields predictions of the unknown parameters that lead to better downstream decisions.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z) - An Online Method for A Class of Distributionally Robust Optimization
with Non-Convex Objectives [54.29001037565384]
We propose a practical online method for solving a class of online distributionally robust optimization (DRO) problems.
Our studies demonstrate important applications in machine learning for improving the robustness of networks.
arXiv Detail & Related papers (2020-06-17T20:19:25Z)
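The OTClean entry above mentions an iterative algorithm inspired by Sinkhorn's matrix scaling. For reference, below is a minimal sketch of the standard Sinkhorn iteration for entropic optimal transport; it illustrates only that building block, not OTClean's repair procedure, and the cost matrix, marginals, and regularization strength are arbitrary placeholders.

```python
# Standard Sinkhorn matrix-scaling iteration for entropic optimal transport.
# Illustrative only: OTClean's actual repair algorithm is merely "inspired by" this scheme.
import numpy as np

def sinkhorn(cost, a, b, eps=0.1, n_iter=200):
    """Return a transport plan between histograms a and b for a given cost matrix.

    cost: (n, m) pairwise cost matrix
    a, b: source/target marginals (each summing to 1)
    eps: entropic regularization strength
    """
    K = np.exp(-cost / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # scale columns to match the target marginal
        u = a / (K @ v)                  # scale rows to match the source marginal
    return u[:, None] * K * v[None, :]   # transport plan with the prescribed marginals

# Usage example with two small histograms
a = np.array([0.5, 0.5])
b = np.array([0.25, 0.75])
cost = np.array([[0.0, 1.0], [1.0, 0.0]])
plan = sinkhorn(cost, a, b)
print(plan.sum(axis=1), plan.sum(axis=0))   # approximately a and b
```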
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.