Physics-Driven Neural Compensation For Electrical Impedance Tomography
- URL: http://arxiv.org/abs/2504.18067v2
- Date: Mon, 28 Apr 2025 05:14:57 GMT
- Title: Physics-Driven Neural Compensation For Electrical Impedance Tomography
- Authors: Chuyu Wang, Huiting Deng, Dong Liu
- Abstract summary: Electrical Impedance Tomography (EIT) provides a non-invasive, portable imaging modality with significant potential in medical and industrial applications. EIT faces two primary challenges: the ill-posed nature of its inverse problem and the spatially variable, location-dependent sensitivity distribution. We propose PhyNC (Physics-driven Neural Compensation), an unsupervised deep learning framework that incorporates the physical principles of EIT.
- Score: 7.256725037878305
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Electrical Impedance Tomography (EIT) provides a non-invasive, portable imaging modality with significant potential in medical and industrial applications. Despite its advantages, EIT encounters two primary challenges: the ill-posed nature of its inverse problem and the spatially variable, location-dependent sensitivity distribution. Traditional model-based methods mitigate ill-posedness through regularization but overlook sensitivity variability, while supervised deep learning approaches require extensive training data and lack generalization. Recent developments in neural fields have introduced implicit regularization techniques for image reconstruction, but these methods typically neglect the physical principles underlying EIT, thus limiting their effectiveness. In this study, we propose PhyNC (Physics-driven Neural Compensation), an unsupervised deep learning framework that incorporates the physical principles of EIT. PhyNC addresses both the ill-posed inverse problem and the sensitivity distribution by dynamically allocating neural representational capacity to regions with lower sensitivity, ensuring accurate and balanced conductivity reconstructions. Extensive evaluations on both simulated and experimental data demonstrate that PhyNC outperforms existing methods in terms of detail preservation and artifact resistance, particularly in low-sensitivity regions. Our approach enhances the robustness of EIT reconstructions and provides a flexible framework that can be adapted to other imaging modalities with similar challenges.
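As background for the neural-field approach the abstract describes, the sketch below shows its basic ingredient: a small coordinate network mapping a spatial location (x, y) to a conductivity value. The layer sizes, activation, and initialization here are illustrative assumptions for a minimal example, not the authors' actual PhyNC architecture or its sensitivity-compensation scheme.

```python
import numpy as np

# Minimal implicit-neural-representation sketch for EIT-style imaging:
# an MLP maps a spatial coordinate (x, y) to a conductivity value.
# All hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random MLP weights for layer sizes like [2, 32, 32, 1]."""
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, coords):
    """Evaluate the MLP at an (N, 2) array of coordinates."""
    h = coords
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b  # (N, 1) conductivity values

params = init_mlp([2, 32, 32, 1])
# Query the field on an 8x8 grid over the unit square [-1, 1]^2.
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 8),
                            np.linspace(-1, 1, 8)), axis=-1).reshape(-1, 2)
sigma = forward(params, grid)
print(sigma.shape)  # (64, 1)
```

Because the field is a continuous function of coordinates rather than a pixel grid, representational capacity can in principle be steered toward particular regions, which is the lever PhyNC exploits for low-sensitivity areas.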
Related papers
- SDEIT: Semantic-Driven Electrical Impedance Tomography [7.872153285062159]
We introduce SDEIT, a novel semantic-driven framework that integrates Stable Diffusion 3.5 into EIT.
By coupling an implicit neural representation (INR) network with a plug-and-play optimization scheme, SDEIT improves structural consistency and recovers fine details.
This work opens a new pathway for integrating multimodal priors into ill-posed inverse problems like EIT.
arXiv Detail & Related papers (2025-04-05T14:08:58Z) - Towards Understanding the Benefits of Neural Network Parameterizations in Geophysical Inversions: A Study With Neural Fields [1.7396556690675236]
In this work, we employ neural fields, which use a neural network to map a coordinate to the corresponding physical property value at that coordinate, in a test-time learning manner. In test-time learning, the weights are learned during the inversion itself, whereas traditional approaches require a network to be trained beforehand on a separate training data set.
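The test-time learning idea above can be shown in miniature: instead of pre-training on a dataset, model weights are fitted directly against one problem's own measurements. In this simplified stand-in, a frozen random-feature layer plus a trainable linear readout replaces the full neural field, and the target function and learning rate are illustrative assumptions.

```python
import numpy as np

# Test-time learning in miniature: fit weights against this one
# problem's samples, with no prior training set. The target function,
# feature count, and learning rate are illustrative assumptions.

rng = np.random.default_rng(1)

coords = rng.uniform(-1, 1, size=(200, 2))        # measurement locations
target = np.sin(3 * coords[:, 0]) * coords[:, 1]  # stand-in property values

W = rng.standard_normal((2, 64))                  # frozen random features
phi = np.tanh(coords @ W)                         # (200, 64) feature matrix
w = np.zeros(64)                                  # readout, learned at test time

for _ in range(1000):                             # plain gradient descent on MSE
    residual = phi @ w - target
    w -= 0.05 * phi.T @ residual / len(target)

mse = np.mean((phi @ w - target) ** 2)
print(mse)
```

The fitted readout is specific to this one set of measurements, mirroring how a neural-field inversion learns its weights per problem instance rather than generalizing from a training corpus.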
arXiv Detail & Related papers (2025-03-21T19:32:52Z) - Diff-INR: Generative Regularization for Electrical Impedance Tomography [6.7667436349597985]
Electrical Impedance Tomography (EIT) reconstructs conductivity distributions within a body from boundary measurements.
EIT reconstruction is hindered by its ill-posed nonlinear inverse problem, which complicates accurate results.
We propose Diff-INR, a novel method that combines generative regularization with Implicit Neural Representations (INR) through a diffusion model.
arXiv Detail & Related papers (2024-09-06T14:21:23Z) - A Two-Stage Imaging Framework Combining CNN and Physics-Informed Neural Networks for Full-Inverse Tomography: A Case Study in Electrical Impedance Tomography (EIT) [5.772638266457322]
Electrical Impedance Tomography is a highly ill-posed inverse problem. We propose a two-stage hybrid learning framework that combines Convolutional Neural Networks (CNNs) and PINNs. This framework integrates data-driven and model-driven paradigms, blending supervised and unsupervised learning to reconstruct conductivity.
arXiv Detail & Related papers (2024-07-25T02:48:22Z) - Electrical Impedance Tomography: A Fair Comparative Study on Deep Learning and Analytic-based Approaches [2.7392924984179348]
Electrical Impedance Tomography (EIT) is a powerful imaging technique with diverse applications.
The EIT inverse problem is about inferring the internal conductivity distribution of an object from measurements taken on its boundary.
Recent years have witnessed significant progress, driven by innovations in analytic-based approaches and deep learning.
arXiv Detail & Related papers (2023-10-28T08:45:51Z) - Unsupervised Domain Transfer with Conditional Invertible Neural Networks [83.90291882730925]
We propose a domain transfer approach based on conditional invertible neural networks (cINNs).
Our method inherently guarantees cycle consistency through its invertible architecture, and network training can efficiently be conducted with maximum likelihood.
Our method enables the generation of realistic spectral data and outperforms the state of the art on two downstream classification tasks.
arXiv Detail & Related papers (2023-03-17T18:00:27Z) - fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Data-driven generation of plausible tissue geometries for realistic photoacoustic image synthesis [53.65837038435433]
Photoacoustic tomography (PAT) has the potential to recover morphological and functional tissue properties.
We propose a novel approach to PAT data simulation, which we refer to as "learning to simulate".
We leverage the concept of Generative Adversarial Networks (GANs) trained on semantically annotated medical imaging data to generate plausible tissue geometries.
arXiv Detail & Related papers (2021-03-29T11:30:18Z) - Gradient Starvation: A Learning Proclivity in Neural Networks [97.02382916372594]
Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task.
This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks.
arXiv Detail & Related papers (2020-11-18T18:52:08Z) - Limited-angle tomographic reconstruction of dense layered objects by dynamical machine learning [68.9515120904028]
Limited-angle tomography of strongly scattering quasi-transparent objects is a challenging, highly ill-posed problem.
Regularizing priors are necessary to reduce artifacts by improving the condition of such problems.
We devised a recurrent neural network (RNN) architecture with a novel split-convolutional gated recurrent unit (SC-GRU) as the building block.
arXiv Detail & Related papers (2020-07-21T11:48:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.