Physics-Trained Neural Network as Inverse Problem Solver for Potential Fields: An Example of Downward Continuation between Arbitrary Surfaces
- URL: http://arxiv.org/abs/2502.05190v1
- Date: Sun, 26 Jan 2025 15:45:19 GMT
- Title: Physics-Trained Neural Network as Inverse Problem Solver for Potential Fields: An Example of Downward Continuation between Arbitrary Surfaces
- Authors: Jing Sun, Lu Li, Liang Zhang
- Abstract summary: Downward continuation is a critical task in potential field processing, including gravity and magnetic fields. We propose a new physics-trained deep neural network (DNN)-based solution for this task. We test the proposed method on both synthetic magnetic data and real-world magnetic data from West Antarctica.
- Score: 9.727358008769501
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Downward continuation is a critical task in potential field processing, including gravity and magnetic fields, which aims to transfer data from one observation surface to another that is closer to the source of the field. Its effectiveness directly impacts the success of detecting and highlighting subsurface anomalous sources. We treat downward continuation as an inverse problem that relies on solving a forward problem defined by the formula for upward continuation, and we propose a new physics-trained deep neural network (DNN)-based solution for this task. We hard-code the upward continuation process into the DNN's learning framework, where the DNN itself learns to act as the inverse problem solver and can perform downward continuation without ever being shown any ground truth data. We test the proposed method on both synthetic magnetic data and real-world magnetic data from West Antarctica. The preliminary results demonstrate its effectiveness through comparison with selected benchmarks, opening future avenues for the combined use of DNNs and established geophysical theories to address broader potential field inverse problems, such as density and geometry modelling.
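The core idea of hard-coding the forward operator into the training loop can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes flat (level) observation surfaces, so that upward continuation reduces to a Fourier-domain attenuation filter, whereas the paper handles arbitrary surfaces; the small convolutional network, grid sizes, and random stand-in data are all hypothetical.

```python
import torch
import torch.nn as nn

def upward_continue(field, dz, dx):
    # Hard-coded forward operator: level-to-level upward continuation
    # as a Fourier-domain filter exp(-|k| * dz), with dz > 0 upward.
    ny, nx = field.shape[-2:]
    kx = torch.fft.fftfreq(nx, d=dx) * 2 * torch.pi
    ky = torch.fft.fftfreq(ny, d=dx) * 2 * torch.pi
    KX, KY = torch.meshgrid(kx, ky, indexing="xy")
    k = torch.sqrt(KX**2 + KY**2)
    F = torch.fft.fft2(field)
    return torch.fft.ifft2(F * torch.exp(-k * dz)).real

# Hypothetical training loop: the DNN learns to act as the inverse
# problem solver without ever seeing ground-truth downward-continued
# data -- the loss only compares the upward-continued DNN output
# against the observed field.
net = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 1, 3, padding=1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

observed = torch.randn(1, 1, 64, 64)  # stand-in for observed magnetic data
dz, dx = 100.0, 50.0                  # assumed continuation height / grid spacing (m)

for _ in range(100):
    opt.zero_grad()
    lower = net(observed)                          # candidate lower-surface field
    predicted = upward_continue(lower[0, 0], dz, dx)
    loss = ((predicted - observed[0, 0]) ** 2).mean()
    loss.backward()
    opt.step()
```

Because `torch.fft` supports autograd, gradients flow through the physics operator back into the network weights, which is what makes the physics-trained (rather than label-supervised) setup possible.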
Related papers
- Data-Driven and Theory-Guided Pseudo-Spectral Seismic Imaging Using Deep Neural Network Architectures [0.0]
Full Waveform Inversion (FWI) reconstructs high-resolution subsurface models.
FWI faces challenges with solver selection and data availability.
Deep Learning (DL) offers a promising alternative, bridging data-driven and physics-based methods.
This thesis integrates pseudo-spectral FWI into DL, formulating both data-driven and theory-guided approaches.
arXiv Detail & Related papers (2025-02-26T05:46:53Z) - Physics-Informed Deep Learning of Rate-and-State Fault Friction [0.0]
We develop a multi-network PINN for both the forward problem and for direct inversion of nonlinear fault friction parameters.
We present the computational PINN framework for strike-slip faults in 1D and 2D subject to rate-and-state friction.
We find that the network for the parameter inversion at the fault performs much better than the network for material displacements to which it is coupled.
arXiv Detail & Related papers (2023-12-14T23:53:25Z) - A Test-Time Learning Approach to Reparameterize the Geophysical Inverse Problem with a Convolutional Neural Network [1.7396556690675236]
Explicit regularization is often used, but there are opportunities to explore the implicit regularization effects that are inherent in a Neural Network structure.
Researchers have discovered that the Convolutional Neural Network (CNN) architecture inherently enforces a regularization.
In this study, we examine the applicability of this implicit regularization to geophysical inversions.
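The reparameterization idea can be sketched in the spirit of the Deep Image Prior, which is the usual form of this implicit CNN regularization; the architecture, the toy linear forward operator `G`, and the synthetic data below are all assumptions for illustration, not the paper's setup. The inversion model is the output of a CNN fed a fixed random input, and only the CNN weights are fitted to the data misfit, so the network structure itself supplies the regularization.

```python
import torch
import torch.nn as nn

z = torch.randn(1, 8, 32, 32)   # fixed latent input (never optimized)
cnn = nn.Sequential(nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 1, 3, padding=1))

G = torch.randn(50, 32 * 32)    # stand-in linear forward operator
d_obs = torch.randn(50)         # stand-in observed data

opt = torch.optim.Adam(cnn.parameters(), lr=1e-2)
for _ in range(100):
    opt.zero_grad()
    m = cnn(z).reshape(-1)               # model implicitly regularized by the CNN
    loss = ((G @ m - d_obs) ** 2).mean() # only an explicit data-misfit term
    loss.backward()
    opt.step()
```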
arXiv Detail & Related papers (2023-12-07T23:53:30Z) - Solving Inverse Problems with Latent Diffusion Models via Hard Data Consistency [7.671153315762146]
Training diffusion models in the pixel space is both data-intensive and computationally demanding.
Latent diffusion models, which operate in a much lower-dimensional space, offer a solution to these challenges.
We propose ReSample, an algorithm that can solve general inverse problems with pre-trained latent diffusion models.
arXiv Detail & Related papers (2023-07-16T18:42:01Z) - Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs can be trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ implicit gradient descent (ISGD) method to train PINNs for improving the stability of training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore GNNs to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics informed neural networks have difficulty resolving localized effects and strongly non-linear solutions through optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT-scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - Physics Informed Convex Artificial Neural Networks (PICANNs) for Optimal Transport based Density Estimation [13.807546494746207]
We propose a Deep Learning approach to solve the continuous Optimal Mass Transport problem.
We focus on the ubiquitous density estimation and generative modeling tasks in statistics and machine learning.
arXiv Detail & Related papers (2021-04-02T18:44:11Z) - A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space [2.66512000865131]
We establish a connection between deep neural networks (DNNs) and dynamical systems.
Drawing on optimal transport theory, we find that a DNN with weight decay attempts to learn the geodesic curve in the Wasserstein space.
We conclude that a mathematical principle of deep learning is to learn the geodesic curve in the Wasserstein space.
arXiv Detail & Related papers (2021-02-18T09:37:49Z) - Solving Sparse Linear Inverse Problems in Communication Systems: A Deep Learning Approach With Adaptive Depth [51.40441097625201]
We propose an end-to-end trainable deep learning architecture for sparse signal recovery problems.
The proposed method learns how many layers to execute to emit an output, and the network depth is dynamically adjusted for each task in the inference phase.
arXiv Detail & Related papers (2020-10-29T06:32:53Z) - Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising field, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z)