Equivariant Deep Equilibrium Models for Imaging Inverse Problems
- URL: http://arxiv.org/abs/2511.18667v1
- Date: Mon, 24 Nov 2025 00:43:54 GMT
- Title: Equivariant Deep Equilibrium Models for Imaging Inverse Problems
- Authors: Alexander Mehta, Ruangrawee Kitichotkul, Vivek K Goyal, Julián Tachella
- Abstract summary: Equivariant imaging (EI) enables training signal reconstruction models without requiring ground truth data. We show that backpropagation can be implemented modularly, simplifying training. We find evidence that EI-trained DEQs approximate the proximal map of an invariant prior.
- Score: 50.91616288661183
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Equivariant imaging (EI) enables training signal reconstruction models without requiring ground truth data by leveraging signal symmetries. Deep equilibrium models (DEQs) are a powerful class of neural networks where the output is a fixed point of a learned operator. However, training DEQs with complex EI losses requires implicit differentiation through fixed-point computations, whose implementation can be challenging. We show that backpropagation can be implemented modularly, simplifying training. Experiments demonstrate that DEQs trained with implicit differentiation outperform those trained with Jacobian-free backpropagation and other baseline methods. Additionally, we find evidence that EI-trained DEQs approximate the proximal map of an invariant prior.
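To make the training mechanics concrete, below is a minimal NumPy sketch of a DEQ layer contrasting the two backward modes the abstract compares. The update map f(z, x) = tanh(Wz + x), the dimensions, and the contractive scaling of W are toy assumptions, not the paper's architecture: implicit differentiation solves the adjoint system (I - J)^T u = g at the fixed point, while Jacobian-free backpropagation simply drops that linear solve.

```python
import numpy as np

# Toy DEQ layer: z* solves z = f(z, x) with f(z, x) = tanh(W z + x).
# W, d, and the scaling are illustrative assumptions, not the paper's model.
rng = np.random.default_rng(0)
d = 8
W = 0.4 * rng.standard_normal((d, d)) / np.sqrt(d)  # scaled so f is contractive

def f(z, x):
    return np.tanh(W @ z + x)

def forward(x, tol=1e-12, max_iter=1000):
    """Find the fixed point z* = f(z*, x) by plain fixed-point iteration."""
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = f(z, x)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

def backward_implicit(z_star, x, grad_out):
    """Implicit differentiation through the fixed point.
    Differentiating z* = f(z*, x) gives dz*/dx = (I - J)^{-1} df/dx with
    J = df/dz at z*, so dL/dx = (df/dx)^T (I - J)^{-T} grad_out."""
    D = np.diag(1.0 - np.tanh(W @ z_star + x) ** 2)   # tanh'(pre-activation)
    J = D @ W                                         # df/dz at the fixed point
    u = np.linalg.solve((np.eye(d) - J).T, grad_out)  # adjoint linear solve
    return D.T @ u                                    # df/dx = D here

def backward_jfb(z_star, x, grad_out):
    """Jacobian-free backpropagation: skip the (I - J)^{-T} solve and
    backpropagate through a single application of f."""
    D = np.diag(1.0 - np.tanh(W @ z_star + x) ** 2)
    return D.T @ grad_out

x = rng.standard_normal(d)
z_star = forward(x)
g = rng.standard_normal(d)  # upstream gradient dL/dz*

# Check the implicit gradient against central finite differences.
eps = 1e-6
fd = np.array([g @ (forward(x + eps * e) - forward(x - eps * e)) / (2 * eps)
               for e in np.eye(d)])
print("implicit matches FD:", np.allclose(backward_implicit(z_star, x, g), fd, atol=1e-5))
print("JFB gradient error:", np.linalg.norm(backward_jfb(z_star, x, g) - fd))
```

The finite-difference check confirms the implicit gradient; the printed JFB error shows the bias incurred by skipping the adjoint solve.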
Related papers
- GenUQ: Predictive Uncertainty Estimates via Generative Hyper-Networks [0.0]
Operator learning is a generalization of regression to mappings between functions. It has already found applications in several areas such as modeling sea ice, combustion, and atmospheric physics. We introduce GenUQ, a measure-theoretic approach to UQ that avoids constructing a likelihood by introducing a generative hyper-network model.
arXiv Detail & Related papers (2025-09-25T21:19:03Z)
- Strategies for training point distributions in physics-informed neural networks [0.0]
Physics-informed neural networks approach the approximation of differential equations by directly incorporating their structure and given conditions in a loss function. In this paper, we investigate and evaluate a core component of the approach, namely the training point distribution. The results show the impact of the training point distribution on the solution accuracy, and we find evidence that it is connected to the characteristics of the differential equation.
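As a toy, dependency-free illustration (my construction, not this paper's setup) of the two ingredients the summary names, a residual loss assembled at sampled training points and the effect of the point distribution, consider a one-parameter trial function for u''(x) = -sin(x) on [0, pi], exact at a = 1:

```python
import numpy as np

# Hypothetical PINN-style loss: mean squared PDE residual over sampled
# collocation points. The trial function u_a(x) = a*sin(x) replaces the
# network so the residual is available in closed form.
def pde_residual(a, x):
    # u_a''(x) + sin(x) = (1 - a) * sin(x)
    return -a * np.sin(x) + np.sin(x)

def pinn_loss(a, pts):
    return np.mean(pde_residual(a, pts) ** 2)

rng = np.random.default_rng(0)
uniform_pts = rng.uniform(0.0, np.pi, size=64)   # spread over the domain
clustered_pts = rng.uniform(0.0, 0.3, size=64)   # clustered near x = 0

for a in (0.8, 1.0, 1.2):
    print(a, pinn_loss(a, uniform_pts), pinn_loss(a, clustered_pts))
```

Because the residual (1 - a)sin(x) nearly vanishes near x = 0, the clustered set gives an almost flat loss in a, so the sampled point distribution directly controls how well the residual identifies the solution.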
arXiv Detail & Related papers (2025-08-17T09:40:49Z)
- Deep Equilibrium models for Poisson Imaging Inverse problems via Mirror Descent [7.248102801711294]
Deep Equilibrium Models (DEQs) are implicit neural networks with fixed points. We introduce a novel DEQ formulation based on Mirror Descent defined in terms of a tailored non-Euclidean geometry. We propose computational strategies that enable both efficient training and fully parameter-free inference.
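For intuition, here is a generic mirror-descent iteration under the negative-entropy mirror map, a standard choice for Poisson likelihoods because the multiplicative update preserves positivity. This is an illustrative sketch, not the paper's tailored geometry; a DEQ would replace part of the update with a learned operator and run it to a fixed point.

```python
import numpy as np

# Mirror descent with phi(z) = sum(z log z): the update
#   z_{k+1} = grad(phi)^{-1}(grad(phi)(z_k) - tau * grad(F)(z_k))
#           = z_k * exp(-tau * grad(F)(z_k))
# stays strictly positive, which suits Poisson data terms. Toy problem below.
rng = np.random.default_rng(1)
n, m = 16, 32
A = rng.uniform(0.1, 1.0, size=(m, n))      # toy positive forward operator
z_true = rng.uniform(0.5, 2.0, size=n)
y = rng.poisson(A @ z_true).astype(float)   # Poisson measurements

def grad_F(z):
    # Gradient of the Poisson negative log-likelihood
    # F(z) = 1^T A z - y^T log(A z)
    Az = A @ z
    return A.T @ (1.0 - y / Az)

z, tau = np.ones(n), 5e-3
for _ in range(2000):                        # iterate toward a fixed point
    z_next = z * np.exp(-tau * grad_F(z))    # multiplicative (mirror) update
    if np.linalg.norm(z_next - z) < 1e-10 * np.linalg.norm(z):
        break
    z = z_next
print("relative error:", np.linalg.norm(z - z_true) / np.linalg.norm(z_true))
```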
arXiv Detail & Related papers (2025-07-15T16:33:01Z)
- Rao-Blackwell Gradient Estimators for Equivariant Denoising Diffusion [55.95767828747407]
In domains such as molecular and protein generation, physical systems exhibit inherent symmetries that are critical to model. We present a framework that reduces training variance and provides a provably lower-variance gradient estimator. We also present a practical implementation of this estimator incorporating the loss and sampling procedure through a method we call Orbit Diffusion.
arXiv Detail & Related papers (2025-02-14T03:26:57Z)
- Diffeomorphic Latent Neural Operators for Data-Efficient Learning of Solutions to Partial Differential Equations [5.308435208832696]
A computed approximation of the solution operator to a system of partial differential equations (PDEs) is needed in various areas of science and engineering. We propose that, in order to learn a PDE solution operator that can generalize across multiple domains without needing to sample a large enough dataset, we can instead train a latent neural operator on just a few ground truth solution fields.
arXiv Detail & Related papers (2024-11-27T03:16:00Z)
- Improving Equivariant Model Training via Constraint Relaxation [31.507956579770088]
We propose a novel framework for improving the optimization of equivariant models by relaxing the hard equivariance constraint during training. We provide experimental results on different state-of-the-art network architectures, demonstrating how this training framework can result in equivariant models with improved generalization performance.
arXiv Detail & Related papers (2024-08-23T17:35:08Z)
- DeltaPhi: Physical States Residual Learning for Neural Operators in Data-Limited PDE Solving [54.605760146540234]
DeltaPhi is a novel learning framework that transforms the PDE solving task from learning direct input-output mappings to learning the residuals between similar physical states. Extensive experiments demonstrate consistent and significant improvements across diverse physical systems.
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Variational operator learning: A unified paradigm marrying training neural operators and solving partial differential equations [9.148052787201797]
We propose a novel paradigm that provides a unified framework for training neural operators and solving PDEs with the variational form.
With a label-free training set and a 5-label-only shift set, VOL learns solution operators with its test errors decreasing in a power law with respect to the amount of unlabeled data.
arXiv Detail & Related papers (2023-04-09T13:20:19Z)
- IB-UQ: Information bottleneck based uncertainty quantification for neural function regression and neural operator learning [11.5992081385106]
We propose a novel framework for uncertainty quantification via information bottleneck (IB-UQ) for scientific machine learning tasks.
We incorporate the bottleneck by a confidence-aware encoder, which encodes inputs into latent representations according to the confidence of the input data.
We also propose a data augmentation based information bottleneck objective which can enhance the quality of the extrapolation uncertainty.
arXiv Detail & Related papers (2023-02-07T05:56:42Z)
- Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration [77.99182201815763]
Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks.
We propose a novel monomial selection algorithm based on pruning methods to allow an application to more complex problems.
We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2022-02-08T16:16:11Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newtonian mechanics systems with both fully and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Autoencoding Variational Autoencoder [56.05008520271406]
We study the implications of this behaviour for the learned representations and also the consequences of fixing it by introducing a notion of self-consistency.
We show that encoders trained with our self-consistency approach lead to representations that are robust (insensitive) to perturbations in the input introduced by adversarial attacks.
arXiv Detail & Related papers (2020-12-07T14:16:14Z)