Extension and neural operator approximation of the electrical impedance tomography inverse map
- URL: http://arxiv.org/abs/2511.20361v1
- Date: Tue, 25 Nov 2025 14:43:13 GMT
- Title: Extension and neural operator approximation of the electrical impedance tomography inverse map
- Authors: Maarten V. de Hoop, Nikola B. Kovachki, Matti Lassas, Nicholas H. Nelsen
- Abstract summary: This paper considers the problem of noise-robust neural operator approximation for the solution map of Calderón's inverse conductivity problem. The boundary measurements are realized as a noisy perturbation of the Neumann-to-Dirichlet map's integral kernel. The resulting extension shares the same stability properties as the original inverse map from kernels to conductivities, but is now amenable to neural operator approximation.
- Score: 15.065275438824008
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper considers the problem of noise-robust neural operator approximation for the solution map of Calderón's inverse conductivity problem. In this continuum model of electrical impedance tomography (EIT), the boundary measurements are realized as a noisy perturbation of the Neumann-to-Dirichlet map's integral kernel. The theoretical analysis proceeds by extending the domain of the inversion operator to a Hilbert space of kernel functions. The resulting extension shares the same stability properties as the original inverse map from kernels to conductivities, but is now amenable to neural operator approximation. Numerical experiments demonstrate that Fourier neural operators excel at reconstructing infinite-dimensional piecewise constant and lognormal conductivities in noisy setups both within and beyond the theory's assumptions. The methodology developed in this paper for EIT exemplifies a broader strategy for addressing nonlinear inverse problems with a noise-aware operator learning framework.
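The abstract reports that Fourier neural operators reconstruct the conductivities well. The core building block of such models is a spectral convolution: transform to Fourier space, act on a truncated set of low modes with learned multipliers, and transform back. The following is a minimal numpy sketch of one such 1D layer, not the authors' implementation; the grid size, mode count, and variable names are illustrative assumptions.

```python
import numpy as np

def fourier_layer(v, weights, n_modes):
    """One spectral convolution layer of a 1D Fourier neural operator (sketch).

    v       : (n,) samples of a function on a uniform grid
    weights : (n_modes,) complex multipliers (learned parameters in practice)
    n_modes : number of low Fourier modes kept; higher modes are truncated
    """
    v_hat = np.fft.rfft(v)                         # to Fourier space
    out_hat = np.zeros_like(v_hat)
    out_hat[:n_modes] = weights * v_hat[:n_modes]  # act only on low modes
    return np.fft.irfft(out_hat, n=v.size)         # back to physical space

# toy usage: identity multipliers reduce the layer to a pure low-pass filter
rng = np.random.default_rng(0)
v = rng.standard_normal(64)
u = fourier_layer(v, np.ones(8, dtype=complex), n_modes=8)
```

Because the layer is parameterized in Fourier space, the same learned weights can be applied to inputs discretized at different resolutions, which is what makes the architecture a candidate for approximating maps between infinite-dimensional function spaces.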
Related papers
- Hilbert Neural Operator: Operator Learning in the Analytic Signal Domain [0.0]
We introduce the Hilbert Neural Operator (HNO), a new neural operator architecture for operator learning in the analytic signal domain. HNO operates by first mapping the input signal to its analytic representation via the Hilbert transform. We hypothesize that this architecture enables HNO to model operators more effectively for causal, phase-sensitive, and non-stationary systems.
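The analytic representation mentioned in this summary is a standard signal-processing construction: a complex signal whose real part is the input and whose imaginary part is its Hilbert transform. A minimal FFT-based numpy sketch (the usual frequency-domain recipe, not HNO's actual front end):

```python
import numpy as np

def analytic_signal(x):
    """Analytic representation of a real signal via the FFT (sketch).

    The imaginary part of the result is the Hilbert transform of x.
    Negative frequencies are zeroed and positive ones doubled.
    """
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# toy usage: for a pure cosine, |z| recovers the unit amplitude envelope
t = np.linspace(0.0, 1.0, 256, endpoint=False)
x = np.cos(2 * np.pi * 8 * t)
z = analytic_signal(x)
```

The magnitude and angle of the analytic signal expose instantaneous amplitude and phase, which is presumably why a phase-sensitive architecture would start from this representation.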
arXiv Detail & Related papers (2025-08-06T21:12:15Z)
- Stability Bounds for the Unfolded Forward-Backward Algorithm [21.529323131078954]
We consider a neural network architecture designed to solve inverse problems where the degradation operator is linear and known. The robustness of this inversion method to input perturbations is analyzed theoretically. A key novelty of our work lies in examining the robustness of the proposed network to perturbations in its bias.
arXiv Detail & Related papers (2024-12-23T11:55:41Z)
- A DeepONet for inverting the Neumann-to-Dirichlet Operator in Electrical Impedance Tomography: An approximation theoretic perspective and numerical results [2.209921757303168]
In this work, we consider the non-invasive medical imaging modality of Electrical Impedance Tomography. The problem is to recover the conductivity in a medium from a set of data that arises out of a current-to-voltage map. We formulate this inverse problem as an operator-learning problem where the goal is to learn the implicitly defined operator-to-function map.
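A DeepONet of the kind this entry refers to factors the learned operator into a branch network that encodes the input function from sensor samples and a trunk network that encodes the evaluation point. The sketch below is a deliberately minimal numpy version (linear branch, one-layer trunk); all sizes and names are illustrative, not the paper's configuration.

```python
import numpy as np

def deeponet(branch_w, trunk_w, u_sensors, y):
    """Evaluate G(u)(y) with a minimal branch/trunk decomposition (sketch).

    branch_w : (p, m) weights of a linear branch net over m sensor values
    trunk_w  : (p, d) weights of a one-layer trunk net over d-dim query points
    """
    b = branch_w @ u_sensors      # branch: coefficients from the input function
    t = np.tanh(trunk_w @ y)      # trunk: basis functions at the query point
    return b @ t                  # inner product yields the output value

# toy usage with random (untrained) weights
rng = np.random.default_rng(1)
branch_w = rng.standard_normal((16, 32))   # 32 sensor values -> 16 coefficients
trunk_w = rng.standard_normal((16, 2))     # 2D query point -> 16 basis values
u_sensors = rng.standard_normal(32)        # samples of the input function
y = np.array([0.5, -0.1])
val = deeponet(branch_w, trunk_w, u_sensors, y)
```

The branch/trunk inner product is what lets a single network represent a map whose input is itself a function, evaluated at arbitrary query points.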
arXiv Detail & Related papers (2024-07-24T11:34:24Z)
- Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
arXiv Detail & Related papers (2023-10-20T12:45:12Z)
- GibbsDDRM: A Partially Collapsed Gibbs Sampler for Solving Blind Inverse Problems with Denoising Diffusion Restoration [64.8770356696056]
We propose GibbsDDRM, an extension of Denoising Diffusion Restoration Models (DDRM) to a blind setting in which the linear measurement operator is unknown.
The proposed method is problem-agnostic, meaning that a pre-trained diffusion model can be applied to various inverse problems without fine-tuning.
arXiv Detail & Related papers (2023-01-30T06:27:48Z)
- Transformer Meets Boundary Value Inverse Problems [4.165221477234755]
A Transformer-based deep direct sampling method is proposed for solving a class of boundary value inverse problems.
A real-time reconstruction is achieved by evaluating the learned inverse operator between carefully designed data and reconstructed images.
arXiv Detail & Related papers (2022-09-29T17:45:25Z)
- Learning Dynamical Systems via Koopman Operator Regression in Reproducing Kernel Hilbert Spaces [52.35063796758121]
We formalize a framework to learn the Koopman operator from finite data trajectories of the dynamical system.
We link the risk with the estimation of the spectral decomposition of the Koopman operator.
Our results suggest RRR might be beneficial over other widely used estimators.
arXiv Detail & Related papers (2022-05-27T14:57:48Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Convex Analysis of the Mean Field Langevin Dynamics [49.66486092259375]
A convergence rate analysis of the mean field Langevin dynamics is presented.
The proximal Gibbs distribution $p_q$ associated with the dynamics allows us to develop a convergence theory parallel to classical results in convex optimization.
arXiv Detail & Related papers (2022-01-25T17:13:56Z)
- Kernel-based approximation of the Koopman generator and Schrödinger operator [0.3093890460224435]
We show how eigenfunctions can be estimated by solving auxiliary matrix eigenvalue problems.
The resulting algorithms are applied to molecular dynamics and quantum chemistry examples.
arXiv Detail & Related papers (2020-05-27T08:23:29Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
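The quadrature Fourier features this entry describes replace the random frequencies of classical random Fourier features with nodes of a deterministic quadrature rule for the kernel's spectral measure. A minimal numpy sketch of the idea for a 1D RBF kernel, using a Gauss-Hermite rule; the node count is an arbitrary choice and this is not the SLEIPNIR implementation:

```python
import numpy as np

# 20-node Gauss-Hermite rule for the Gaussian spectral measure of the RBF kernel
nodes, wts = np.polynomial.hermite.hermgauss(20)
omegas = np.sqrt(2.0) * nodes            # frequencies: omega = sqrt(2) * node
coef = np.sqrt(wts / np.sqrt(np.pi))     # per-frequency feature weights

def phi(x):
    """Deterministic Fourier feature map: phi(x) @ phi(y) ~ exp(-(x-y)**2 / 2)."""
    return np.concatenate([coef * np.cos(omegas * x), coef * np.sin(omegas * x)])

# toy usage: the feature inner product approximates the RBF kernel value
x, y = 0.3, -0.5
approx = phi(x) @ phi(y)
exact = np.exp(-0.5 * (x - y) ** 2)
```

Because the frequencies come from a quadrature rule rather than Monte Carlo sampling, the kernel approximation error decays deterministically with the number of nodes, which is the kind of non-asymptotic guarantee the abstract refers to.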
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.