Thermodynamically-Informed Iterative Neural Operators for Heterogeneous Elastic Localization
- URL: http://arxiv.org/abs/2411.06529v1
- Date: Sun, 10 Nov 2024 17:11:49 GMT
- Title: Thermodynamically-Informed Iterative Neural Operators for Heterogeneous Elastic Localization
- Authors: Conlain Kelly, Surya R. Kalidindi
- Abstract summary: In this work, we focus on a canonical problem in computational mechanics: prediction of local elastic deformation fields over heterogeneous material structures.
We construct a hybrid approximation for the coefficient-to-solution map using a Thermodynamically-informed Iterative Neural Operator (TherINO).
Through an extensive series of case studies, we elucidate the advantages of these design choices in terms of efficiency, accuracy, and flexibility.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Engineering problems frequently require solution of governing equations with spatially-varying discontinuous coefficients. Even for linear elliptic problems, mapping large ensembles of coefficient fields to solutions can become a major computational bottleneck using traditional numerical solvers. Furthermore, machine learning methods such as neural operators struggle to fit these maps due to sharp transitions and high contrast in the coefficient fields and a scarcity of informative training data. In this work, we focus on a canonical problem in computational mechanics: prediction of local elastic deformation fields over heterogeneous material structures subjected to periodic boundary conditions. We construct a hybrid approximation for the coefficient-to-solution map using a Thermodynamically-informed Iterative Neural Operator (TherINO). Rather than using coefficient fields as direct inputs and iterating over a learned latent space, we employ thermodynamic encodings -- drawn from the constitutive equations -- and iterate over the solution space itself. Through an extensive series of case studies, we elucidate the advantages of these design choices in terms of efficiency, accuracy, and flexibility. We also analyze the model's stability and extrapolation properties on out-of-distribution coefficient fields and demonstrate an improved speed-accuracy tradeoff for predicting elastic quantities of interest.
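The iterative scheme described in the abstract, in which the current strain iterate is passed through the constitutive law to form a thermodynamic encoding and a learned operator then proposes a correction in the solution space, can be illustrated with the hypothetical sketch below. Class names, tensor layouts, and the plain CNN update are stand-ins chosen for brevity, not the paper's actual architecture or training setup.

```python
import torch
import torch.nn as nn


class IterativeLocalizer(nn.Module):
    """Toy TherINO-style refinement loop (2-D plane-strain Voigt notation
    is assumed purely to keep the example small)."""

    def __init__(self, num_iters: int = 8, width: int = 32):
        super().__init__()
        self.num_iters = num_iters
        # Stand-in for the learned update operator; a plain CNN is used
        # here purely for brevity.
        self.update_net = nn.Sequential(
            nn.Conv2d(6, width, 3, padding=1), nn.GELU(),
            nn.Conv2d(width, 3, 3, padding=1),
        )

    def thermodynamic_encoding(self, stiffness, strain):
        # Pass the current iterate through the constitutive law:
        # sigma_i(x) = C_ij(x) eps_j(x) in Voigt indices.
        return torch.einsum('bhwij,bhwj->bhwi', stiffness, strain)

    def forward(self, stiffness, mean_strain):
        # stiffness: (B, H, W, 3, 3) Voigt stiffness field
        # mean_strain: (B, 3) applied macroscopic strain
        B, H, W = stiffness.shape[:3]
        strain = mean_strain[:, None, None, :].expand(B, H, W, 3).clone()
        for _ in range(self.num_iters):
            stress = self.thermodynamic_encoding(stiffness, strain)
            feats = torch.cat([strain, stress], dim=-1).permute(0, 3, 1, 2)
            correction = self.update_net(feats).permute(0, 2, 3, 1)
            strain = strain + correction  # iterate over the solution field itself
        return strain


# Usage: predict local strain fields for a small batch of synthetic stiffness fields.
model = IterativeLocalizer()
C = torch.rand(4, 64, 64, 3, 3)
eps_bar = torch.tensor([[0.01, 0.0, 0.0]]).expand(4, 3)
local_strain = model(C, eps_bar)  # (4, 64, 64, 3)
```

Because the iteration state is the physical strain field rather than a latent code, the constitutive encoding can be recomputed exactly at every step.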
Related papers
- Finite Operator Learning: Bridging Neural Operators and Numerical Methods for Efficient Parametric Solution and Optimization of PDEs [0.0]
We introduce a method that combines neural operators, physics-informed machine learning, and standard numerical methods for solving PDEs.
We can parametrically solve partial differential equations in a data-free manner and provide accurate sensitivities.
Our study focuses on the steady-state heat equation within heterogeneous materials.
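As a rough illustration of the data-free idea for the steady-state heat equation, a surrogate mapping heterogeneous conductivity fields to temperature fluctuations can be trained purely against a discretized residual of div(k grad T) = 0, with no labelled solutions. The sketch below uses a periodic finite-difference residual and a stand-in CNN; the paper's actual formulation, discretization, and sensitivity computation are not reproduced here.

```python
import torch
import torch.nn as nn


def heat_residual(k, T, g=(1.0, 0.0), h=1.0):
    # Residual of div(k * (grad T + g)) = 0 on a periodic grid, where g is an
    # applied macroscopic temperature gradient and T the periodic fluctuation.
    # All tensors are shaped (B, H, W); central differences with spacing h.
    dTdx = (torch.roll(T, -1, dims=-1) - torch.roll(T, 1, dims=-1)) / (2 * h)
    dTdy = (torch.roll(T, -1, dims=-2) - torch.roll(T, 1, dims=-2)) / (2 * h)
    qx, qy = k * (dTdx + g[0]), k * (dTdy + g[1])
    return ((torch.roll(qx, -1, dims=-1) - torch.roll(qx, 1, dims=-1))
            + (torch.roll(qy, -1, dims=-2) - torch.roll(qy, 1, dims=-2))) / (2 * h)


# Stand-in surrogate: conductivity field -> temperature fluctuation field.
surrogate = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1, padding_mode='circular'), nn.GELU(),
    nn.Conv2d(32, 1, 3, padding=1, padding_mode='circular'),
)

k = torch.rand(8, 1, 32, 32) + 0.5                    # heterogeneous conductivity samples
T = surrogate(k)                                      # predicted fluctuation field
loss = heat_residual(k[:, 0], T[:, 0]).pow(2).mean()  # physics loss, no labelled data
loss.backward()
```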
arXiv Detail & Related papers (2024-07-04T21:23:12Z) - Statistical Mechanics of Dynamical System Identification [3.1484174280822845]
We develop a statistical mechanical approach to analyze sparse equation discovery algorithms.
In this framework, statistical mechanics offers tools to analyze the interplay between complexity and fitness.
arXiv Detail & Related papers (2024-03-04T04:32:28Z) - Discovering Interpretable Physical Models using Symbolic Regression and
Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - Waveformer for modelling dynamical systems [1.0878040851638]
We propose "waveformer", a novel operator learning approach for learning solutions of dynamical systems.
The proposed Waveformer exploits wavelet transforms to capture the spatial multi-scale behavior of the solution field and transformers to model its temporal dynamics.
We show that the proposed Waveformer can learn the solution operator with high accuracy, outperforming existing state-of-the-art operator learning algorithms by up to an order of magnitude.
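A toy version of this wavelet-plus-transformer pipeline is sketched below; the wavelet family, decomposition level, model size, and the use of PyWavelets for the transform are illustrative choices, not the paper's implementation.

```python
import numpy as np
import pywt
import torch
import torch.nn as nn


def wavelet_tokens(snapshots, wavelet='db2', level=2):
    # snapshots: (T, N) array of 1-D solution fields over T time steps.
    # Each snapshot is decomposed into multi-scale wavelet coefficients
    # (coarse to fine) and flattened into one token.
    tokens = [np.concatenate(pywt.wavedec(u, wavelet, level=level)) for u in snapshots]
    return torch.tensor(np.stack(tokens), dtype=torch.float32)


snapshots = np.random.rand(16, 128)     # synthetic trajectory of a 1-D field
tok = wavelet_tokens(snapshots)         # (16, n_coeffs)
d_model = tok.shape[-1]

# A transformer encoder models the dynamics across the time dimension.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=1, batch_first=True),
    num_layers=2,
)
dynamics = encoder(tok.unsqueeze(0))    # (1, 16, n_coeffs)
```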
arXiv Detail & Related papers (2023-10-08T03:34:59Z) - Solving Forward and Inverse Problems of Contact Mechanics using
Physics-Informed Neural Networks [0.0]
We deploy PINNs in a mixed-variable formulation enhanced by output transformation to enforce hard and soft constraints.
We show that PINNs can serve as pure partial differential equation (PDE) solvers, as data-enhanced forward models, and as fast-to-evaluate surrogate models.
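The output-transformation idea for hard constraints is straightforward to illustrate on a one-dimensional toy problem: compose the network with a function that satisfies the boundary condition by construction, and keep the differential-equation residual as a soft constraint. The paper's mixed-variable contact formulation is considerably more involved; the sketch below only shows the general mechanism.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))


def u(x):
    # Hard constraint via output transformation: u(0) = 0 for any weights.
    return x * net(x)


x = torch.rand(64, 1, requires_grad=True)
u_x = torch.autograd.grad(u(x).sum(), x, create_graph=True)[0]
residual = u_x - torch.cos(x)   # soft constraint: enforce u' = cos(x), i.e. u = sin(x)
loss = residual.pow(2).mean()
loss.backward()
```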
arXiv Detail & Related papers (2023-08-24T11:31:24Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural
Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
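A minimal caricature of the DCT-plus-RNN combination is sketched below: each snapshot's 2-D cosine transform is truncated to its low-frequency block and a GRU advances the retained coefficients in time. Truncation size, field resolution, and the GRU itself are illustrative assumptions, not the paper's architecture.

```python
import numpy as np
from scipy.fft import dctn, idctn
import torch
import torch.nn as nn


def encode(u, k=8):
    # Keep only the k x k low-frequency block of the 2-D DCT.
    return dctn(u, norm='ortho')[:k, :k].reshape(-1)


def decode(c, shape=(32, 32), k=8):
    # Zero-pad the retained coefficients and invert the transform.
    full = np.zeros(shape)
    full[:k, :k] = c.reshape(k, k)
    return idctn(full, norm='ortho')


trajectory = np.random.rand(20, 32, 32)   # synthetic field over 20 time steps
codes = torch.tensor(np.stack([encode(u) for u in trajectory]),
                     dtype=torch.float32).unsqueeze(0)        # (1, T, 64)

rnn = nn.GRU(input_size=64, hidden_size=64, batch_first=True)
pred_codes, _ = rnn(codes)                                    # coefficient dynamics
next_field = decode(pred_codes[0, -1].detach().numpy())       # back to physical space
```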
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
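The basic building block, learned edge messages that are aggregated and used to update node states in place of hand-designed stencils, can be written with plain tensor operations. The sketch below shows one such layer on a periodic 1-D grid; the paper's full encode-process-decode solver and training scheme are not reproduced.

```python
import torch
import torch.nn as nn


class MPLayer(nn.Module):
    def __init__(self, dim: int = 32):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * dim, dim), nn.GELU(), nn.Linear(dim, dim))
        self.upd = nn.Sequential(nn.Linear(2 * dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, h, edges):
        src, dst = edges                                   # (E,) node index tensors
        m = self.msg(torch.cat([h[src], h[dst]], dim=-1))  # learned edge messages
        agg = torch.zeros_like(h).index_add_(0, dst, m)    # sum incoming messages
        return h + self.upd(torch.cat([h, agg], dim=-1))   # residual node update


# Nearest-neighbour edges of a periodic 1-D grid with 64 nodes.
n = 64
idx = torch.arange(n)
edges = (torch.cat([idx, idx]), torch.cat([(idx + 1) % n, (idx - 1) % n]))

h = torch.randn(n, 32)        # node features (e.g., encoded solution history)
h_new = MPLayer()(h, edges)   # one message-passing step
```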
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Approximate Latent Force Model Inference [1.3927943269211591]
Latent force models offer an interpretable alternative to purely data-driven tools for inference in dynamical systems.
We show that a neural operator approach can scale our model to thousands of instances, enabling fast, distributed computation.
arXiv Detail & Related papers (2021-09-24T09:55:00Z) - Recurrent Localization Networks applied to the Lippmann-Schwinger
Equation [0.0]
We present a novel machine learning approach for solving equations of the generalized Lippmann-Schwinger (L-S) type.
As part of a learning-based loop unrolling, we use a recurrent convolutional neural network to iteratively solve the governing equations for a field of interest.
We demonstrate our learning approach on the two-phase elastic localization problem, where it achieves excellent accuracy on the predictions of the local (i.e., voxel-level) elastic strains.
arXiv Detail & Related papers (2021-01-29T20:54:17Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)