Physics-informed neural networks for diffraction tomography
- URL: http://arxiv.org/abs/2207.14230v1
- Date: Thu, 28 Jul 2022 16:56:50 GMT
- Title: Physics-informed neural networks for diffraction tomography
- Authors: Amirhossein Saba, Carlo Gigli, Ahmed B. Ayoub, and Demetri Psaltis
- Abstract summary: We propose a physics-informed neural network as the forward model for tomographic reconstructions of biological samples.
By training this network with the Helmholtz equation as a physical loss, we can predict the scattered field accurately.
- Score: 0.1199955563466263
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a physics-informed neural network as the forward model for
tomographic reconstructions of biological samples. We demonstrate that by
training this network with the Helmholtz equation as a physical loss, we can
predict the scattered field accurately. It will be shown that a pretrained
network can be fine-tuned for different samples and used for solving the
scattering problem much faster than other numerical solutions. We evaluate our
methodology with numerical and experimental results. Our physics-informed
neural networks can be generalized for any forward and inverse scattering
problem.
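The key ingredient above is training the forward model with the Helmholtz equation as a physical loss. The sketch below illustrates one plausible way to assemble such a loss; it is not the authors' implementation. It assumes a 2-D scalar problem with a real refractive-index map n²(x, y), a coordinate network that outputs the real and imaginary parts of the scattered field, and the scattered-field form ∇²u_s + k₀²n²u_s = -k₀²(n² - 1)u_in; the names HelmholtzPINN and helmholtz_loss are illustrative.

```python
# Minimal sketch (not the authors' code): a coordinate network trained with a
# Helmholtz-equation residual as the physical loss. The scattered-field form
# assumes a real refractive-index map n^2(x, y) and a known incident field.
import torch
import torch.nn as nn

class HelmholtzPINN(nn.Module):
    def __init__(self, width=128, depth=4):
        super().__init__()
        layers, d_in = [], 2                       # input: (x, y) coordinates
        for _ in range(depth):
            layers += [nn.Linear(d_in, width), nn.Tanh()]
            d_in = width
        layers.append(nn.Linear(width, 2))         # output: Re and Im of the scattered field
        self.net = nn.Sequential(*layers)

    def forward(self, xy):
        return self.net(xy)

def laplacian(u, xy):
    """Sum of second derivatives of the scalar field u with respect to xy."""
    g = torch.autograd.grad(u.sum(), xy, create_graph=True)[0]
    lap = 0.0
    for i in range(xy.shape[1]):
        lap = lap + torch.autograd.grad(g[:, i].sum(), xy, create_graph=True)[0][:, i]
    return lap

def helmholtz_loss(model, xy, n2, u_in, k0):
    """Residual of  ∇²u_s + k0² n² u_s = -k0² (n² - 1) u_in  at collocation points xy."""
    xy = xy.requires_grad_(True)
    u_s = model(xy)                                # (N, 2): real and imaginary parts
    loss = 0.0
    for c in range(2):                             # real coefficients: Re/Im decouple
        res = (laplacian(u_s[:, c], xy)
               + k0**2 * n2 * u_s[:, c]
               + k0**2 * (n2 - 1.0) * u_in[:, c])
        loss = loss + (res**2).mean()
    return loss
```

In practice this loss would be minimized over collocation points inside the sample volume, possibly together with a measured-field term, and, as the abstract notes, a pretrained network could then be fine-tuned for each new sample.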
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Super-resolving sparse observations in partial differential equations: A physics-constrained convolutional neural network approach [6.85316573653194]
We propose a physics-constrained convolutional neural network (CNN) to infer the high-resolution solution from sparse observations of nonlinear partial differential equations.
We show that, by constraining the network with prior physical knowledge, we can infer the unresolved physical dynamics without using high-resolution training data.
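As a rough, hypothetical illustration of the physics-constrained idea above (not the paper's code), the sketch below fits a CNN to sparse observations while penalizing a finite-difference PDE residual on the dense output grid, so no high-resolution labels are required. The Poisson-type constraint and the names SuperResCNN and poisson_residual are stand-in assumptions.

```python
# Minimal sketch (not the paper's code): a CNN maps sparse observations to a dense
# field and is trained with a data term at the observed points plus a
# finite-difference PDE residual. The Poisson constraint and all names are assumptions.
import torch
import torch.nn as nn

class SuperResCNN(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, coarse):                   # coarse: (B, 1, H, W) masked/sparse input
        return self.net(coarse)

def poisson_residual(u, f, h):
    """Finite-difference residual of ∇²u - f on the interior of the grid."""
    lap = (u[:, :, 2:, 1:-1] + u[:, :, :-2, 1:-1] +
           u[:, :, 1:-1, 2:] + u[:, :, 1:-1, :-2] -
           4.0 * u[:, :, 1:-1, 1:-1]) / h**2
    return lap - f[:, :, 1:-1, 1:-1]

def loss_fn(model, coarse, obs_values, obs_mask, f, h, lam=1.0):
    u = model(coarse)
    data = ((u - obs_values)[obs_mask] ** 2).mean()   # fit only the sparse observations
    phys = (poisson_residual(u, f, h) ** 2).mean()    # physics constraint on the dense grid
    return data + lam * phys
```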
arXiv Detail & Related papers (2023-06-19T15:00:04Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics-informed neural networks have been applied successfully to a broad variety of problems in applied mathematics and engineering.
Because they rely on a global approximation, physics-informed neural networks have difficulty resolving localized effects and strongly nonlinear solutions by optimization.
It is shown that a domain decomposition approach can accurately resolve nonlinear stress, displacement, and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation [60.80172153614544]
Un-trained convolutional neural networks have emerged as highly successful tools for image recovery and restoration.
We show that an un-trained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured, from a near minimal number of random measurements.
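A hedged sketch of the un-trained-network idea: the weights of a randomly initialized convolutional generator are optimized so that its output reproduces the compressive measurements y ≈ A x, with no training data involved. The architecture, the dense measurement matrix A, and the name recover are illustrative assumptions rather than the paper's setup.

```python
# Minimal sketch (not the paper's code) of recovery with an un-trained convolutional
# network: the weights are fitted to the random measurements only; no training data.
import torch
import torch.nn as nn

def recover(y, A, shape=(1, 1, 64, 64), steps=2000, lr=1e-3):
    """y: (m,) measurements, A: (m, n) random measurement matrix, n = prod(shape)."""
    net = nn.Sequential(                       # small un-trained convolutional generator
        nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 1, 3, padding=1),
    )
    z = torch.randn(shape)                     # fixed random input
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        x = net(z).reshape(-1)                 # candidate image, flattened to (n,)
        loss = ((A @ x - y) ** 2).mean()       # fit the compressive measurements only
        loss.backward()
        opt.step()
    return net(z).detach()
```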
arXiv Detail & Related papers (2020-05-07T15:57:25Z)
- Neural Network Solutions to Differential Equations in Non-Convex Domains: Solving the Electric Field in the Slit-Well Microfluidic Device [1.7188280334580193]
The neural network method is used to approximate the electric potential and corresponding electric field in a slit-well microfluidic device.
By the metrics considered, deep neural networks significantly outperform shallow neural networks.
arXiv Detail & Related papers (2020-04-25T21:20:03Z)
- Understanding and mitigating gradient pathologies in physics-informed neural networks [2.1485350418225244]
This work focuses on the effectiveness of physics-informed neural networks in predicting outcomes of physical systems and discovering hidden physics from noisy data.
We present a learning rate annealing algorithm that utilizes gradient statistics during model training to balance the interplay between different terms in composite loss functions.
We also propose a novel neural network architecture that is more resilient to such gradient pathologies.
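As a rough sketch in the spirit of the annealing scheme above (not necessarily the paper's exact update rule), a loss weight can be scaled by the ratio of gradient magnitudes of the competing terms and smoothed with an exponential moving average; the rule and the names update_weight and alpha below are assumptions.

```python
# Minimal sketch (assumptions marked): adaptive weighting of composite PINN loss
# terms using gradient statistics, in the spirit of the annealing scheme above.
import torch

def grad_norm_stats(loss, params):
    """Absolute values of d(loss)/d(params), concatenated into one vector."""
    grads = torch.autograd.grad(loss, params, retain_graph=True, allow_unused=True)
    return torch.cat([g.reshape(-1).abs() for g in grads if g is not None])

def update_weight(loss_residual, loss_data, params, lam, alpha=0.9):
    """Exponentially averaged ratio of residual-gradient scale to data-gradient scale."""
    g_res = grad_norm_stats(loss_residual, params)
    g_dat = grad_norm_stats(loss_data, params)
    lam_hat = g_res.max() / (g_dat.mean() + 1e-12)
    return alpha * lam + (1.0 - alpha) * lam_hat.detach()

# Inside a training loop one might use:
#   lam = update_weight(loss_r, loss_d, list(model.parameters()), lam)
#   total = loss_r + lam * loss_d
#   total.backward()
```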
arXiv Detail & Related papers (2020-01-13T21:23:49Z)
- Mean-Field and Kinetic Descriptions of Neural Differential Equations [0.0]
In this work we focus on a particular class of neural networks, namely residual neural networks.
We analyze steady states and sensitivity with respect to the parameters of the network, namely the weights and the bias.
A modification of the microscopic dynamics, inspired by residual neural networks, leads to a Fokker-Planck formulation of the network.
arXiv Detail & Related papers (2020-01-07T13:41:27Z)
This list is automatically generated from the titles and abstracts of the papers on this site.