Learned Regularization for Microwave Tomography
- URL: http://arxiv.org/abs/2508.08114v1
- Date: Mon, 11 Aug 2025 15:54:58 GMT
- Title: Learned Regularization for Microwave Tomography
- Authors: Bowen Tong, Hao Chen, Shaorui Guo, Dong Liu
- Abstract summary: Single-Step Diffusion Regularization (SSD-Reg) is a novel approach that embeds diffusion priors into the iterative reconstruction process. SSD-Reg maintains fidelity to both the governing physics and learned structural details. It provides a flexible and effective solution for tackling the ill-posedness inherent in functional image reconstruction.
- Score: 7.792752191078406
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Microwave Tomography (MWT) aims to reconstruct the dielectric properties of tissues from measured scattered electromagnetic fields. This inverse problem is highly nonlinear and ill-posed, posing significant challenges for conventional optimization-based methods, which, despite being grounded in physical models, often fail to recover fine structural details. Recent deep learning strategies, including end-to-end and post-processing networks, have improved reconstruction quality but typically require large paired training datasets and may struggle to generalize. To overcome these limitations, we propose a physics-informed hybrid framework that integrates diffusion models as learned regularization within a data-consistency-driven variational scheme. Specifically, we introduce Single-Step Diffusion Regularization (SSD-Reg), a novel approach that embeds diffusion priors into the iterative reconstruction process, enabling the recovery of complex anatomical structures without the need for paired data. SSD-Reg maintains fidelity to both the governing physics and learned structural distributions, improving accuracy, stability, and robustness. Extensive experiments demonstrate that SSD-Reg, implemented as a Plug-and-Play (PnP) module, provides a flexible and effective solution for tackling the ill-posedness inherent in functional image reconstruction.
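The abstract describes the general Plug-and-Play pattern: alternate a physics-driven data-consistency update with a learned regularization step. The paper does not include code, so the sketch below is a minimal, hypothetical illustration of that structure only. It uses a linear forward model `A` as a stand-in for the nonlinear scattering operator, and a toy smoothing function in place of the trained single-step diffusion prior; the function names and parameters are assumptions, not the authors' implementation.

```python
import numpy as np

def pnp_reconstruct(A, y, denoise, x0, step=0.5, n_iters=200):
    """Plug-and-Play reconstruction: alternate a gradient step on the
    data-fidelity term ||A x - y||^2 with a learned denoiser acting as
    the regularizer (in SSD-Reg, a single-step diffusion prior)."""
    x = x0.copy()
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)   # data-consistency gradient
        x = x - step * grad        # physics-driven update
        x = denoise(x)             # learned regularization step (plugged in)
    return x

def toy_denoiser(x):
    """Stand-in prior: mild shrinkage toward a 3-tap local average.
    Purely illustrative; a trained diffusion model would go here."""
    return 0.9 * x + 0.1 * np.convolve(x, np.ones(3) / 3, mode="same")

# Toy 1-D problem: recover a piecewise-constant profile from noisy projections.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40)) / np.sqrt(80)
x_true = np.zeros(40)
x_true[10:20] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = pnp_reconstruct(A, y, toy_denoiser, np.zeros(40))
```

Because the prior enters only through the `denoise` callable, swapping the toy smoother for a learned model changes nothing else in the loop, which is what makes the approach "plug-and-play."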
Related papers
- MEMTS: Internalizing Domain Knowledge via Parameterized Memory for Retrieval-Free Domain Adaptation of Time Series Foundation Models [51.506429027626005]
Memory for Time Series (MEMTS) is a lightweight and plug-and-play method for retrieval-free domain adaptation in time series forecasting. A key component of MEMTS is the Knowledge Persistence Module (KPM), which internalizes domain-specific temporal dynamics. This paradigm shift enables MEMTS to achieve accurate domain adaptation with constant-time inference and near-zero latency.
arXiv Detail & Related papers (2026-02-14T14:00:06Z) - A Physics-Informed U-net-LSTM Network for Data-Driven Seismic Response Modeling of Structures [0.0]
Recent developments in deep learning have shown promise in reducing the computational cost of nonlinear seismic analysis of structures. We propose a novel Physics-Informed U-Net-LSTM framework that integrates physical laws with deep learning to enhance both accuracy and efficiency.
arXiv Detail & Related papers (2025-11-26T11:05:42Z) - Flow-Matching Guided Deep Unfolding for Hyperspectral Image Reconstruction [53.26903617819014]
The Flow-Matching-guided Unfolding network (FMU) is the first to integrate flow matching into HSI reconstruction. To further strengthen the learned dynamics, we introduce a mean velocity loss. Experiments on both simulated and real datasets show that FMU significantly outperforms existing approaches in reconstruction quality.
arXiv Detail & Related papers (2025-10-02T11:32:00Z) - A Residual Guided strategy with Generative Adversarial Networks in training Physics-Informed Transformer Networks [8.614387766858496]
We propose a novel Residual-Guided Training strategy for Physics-Informed Transformer networks via a Generative Adversarial Network (GAN). Our framework integrates a Transformer, which inherently captures temporal correlations through autoregressive processing, with a residual-aware GAN. Experiments on the Allen-Cahn-Gordon and Navier-Stokes equations demonstrate significant improvements, with relative MSE reductions of up to three orders of magnitude compared to baseline methods.
arXiv Detail & Related papers (2025-07-15T03:45:42Z) - AlphaFold Database Debiasing for Robust Inverse Folding [58.792020809180336]
We introduce a Debiasing Structure AutoEncoder (DeSAE) that learns to reconstruct native-like conformations from intentionally corrupted backbone geometries. At inference, applying DeSAE to AFDB structures produces debiased structures that significantly improve inverse folding performance.
arXiv Detail & Related papers (2025-06-10T02:25:31Z) - High-Fidelity Scientific Simulation Surrogates via Adaptive Implicit Neural Representations [51.90920900332569]
Implicit neural representations (INRs) offer a compact and continuous framework for modeling spatially structured data. Recent approaches address this by introducing additional features along rigid geometric structures. We propose a simple yet effective alternative: Feature-Adaptive INR (FA-INR).
arXiv Detail & Related papers (2025-06-07T16:45:17Z) - Weight Spectra Induced Efficient Model Adaptation [54.8615621415845]
Fine-tuning large-scale foundation models incurs prohibitive computational costs. We show that fine-tuning predominantly amplifies the top singular values while leaving the remainder largely intact. We propose a novel method that leverages learnable rescaling of top singular directions.
arXiv Detail & Related papers (2025-05-29T05:03:29Z) - Restoration Score Distillation: From Corrupted Diffusion Pretraining to One-Step High-Quality Generation [82.39763984380625]
We propose Restoration Score Distillation (RSD), a principled generalization of Denoising Score Distillation (DSD). RSD accommodates a broader range of corruption types, such as blurred, incomplete, or low-resolution images. It consistently surpasses its teacher model across diverse restoration tasks on both natural and scientific datasets.
arXiv Detail & Related papers (2025-05-19T17:21:03Z) - SDEIT: Semantic-Driven Electrical Impedance Tomography [7.872153285062159]
We introduce SDEIT, a novel semantic-driven framework that integrates Stable Diffusion 3.5 into EIT. By coupling an implicit neural representation (INR) network with a plug-and-play optimization scheme, SDEIT improves structural consistency and recovers fine details. This work opens a new pathway for integrating multimodal priors into ill-posed inverse problems like EIT.
arXiv Detail & Related papers (2025-04-05T14:08:58Z) - Model Hemorrhage and the Robustness Limits of Large Language Models [119.46442117681147]
Large language models (LLMs) demonstrate strong performance across natural language processing tasks, yet undergo significant performance degradation when modified for deployment. We define this phenomenon as model hemorrhage: performance decline caused by parameter alterations and architectural changes.
arXiv Detail & Related papers (2025-03-31T10:16:03Z) - Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships. Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z) - Physics-Driven Autoregressive State Space Models for Medical Image Reconstruction [5.208643222679356]
We propose MambaRoll, a physics-driven autoregressive state space model (SSM) for high-fidelity and efficient image reconstruction. MambaRoll employs an unrolled architecture in which each cascade autoregressively predicts finer-scale feature maps from coarser-scale representations. Demonstrations on accelerated MRI and sparse-view CT reconstruction show that MambaRoll consistently outperforms state-of-the-art CNN-, transformer-, and SSM-based methods.
arXiv Detail & Related papers (2024-12-12T14:59:56Z) - Diff-INR: Generative Regularization for Electrical Impedance Tomography [6.7667436349597985]
Electrical Impedance Tomography (EIT) reconstructs conductivity distributions within a body from boundary measurements.
EIT reconstruction is hindered by its ill-posed nonlinear inverse problem, which complicates accurate results.
We propose Diff-INR, a novel method that combines generative regularization with Implicit Neural Representations (INR) through a diffusion model.
arXiv Detail & Related papers (2024-09-06T14:21:23Z) - Physics-Informed Machine Learning for Seismic Response Prediction of Nonlinear Steel Moment Resisting Frame Structures [6.483318568088176]
The PiML method integrates scientific principles and physical laws into deep neural networks to model the seismic responses of nonlinear structures.
Manipulating the equation of motion helps the network learn system nonlinearities and confines solutions to physically interpretable results.
The resulting model handles complex data better than existing physics-guided LSTM models and outperforms other non-physics-based data-driven networks.
arXiv Detail & Related papers (2024-02-28T02:16:03Z) - Iterative Soft Shrinkage Learning for Efficient Image Super-Resolution [91.3781512926942]
Image super-resolution (SR) has witnessed extensive neural network designs from CNN to transformer architectures.
This work investigates the potential of network pruning for super-resolution, to take advantage of off-the-shelf network designs and reduce the underlying computational overhead.
We propose a novel Iterative Soft Shrinkage-Percentage (ISS-P) method that optimizes the sparse structure of a randomly initialized network at each iteration and tweaks unimportant weights on-the-fly by a small amount proportional to the magnitude scale.
arXiv Detail & Related papers (2023-03-16T21:06:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.