Infinite dimensional generative sensing
- URL: http://arxiv.org/abs/2603.03196v1
- Date: Tue, 03 Mar 2026 17:52:18 GMT
- Title: Infinite dimensional generative sensing
- Authors: Paolo Angella, Vito Paolo Pastore, Matteo Santacesaria
- Abstract summary: This work presents a rigorous framework for generative compressed sensing in Hilbert spaces. Thanks to a generalization of the Restricted Isometry Property, we show that stable recovery holds when the number of measurements is proportional to the prior's intrinsic dimension.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep generative models have become a standard for modeling priors for inverse problems, going beyond classical sparsity-based methods. However, existing theoretical guarantees are mostly confined to finite-dimensional vector spaces, creating a gap when the physical signals are modeled as functions in Hilbert spaces. This work presents a rigorous framework for generative compressed sensing in Hilbert spaces. We extend the notion of local coherence to the infinite-dimensional setting in order to derive optimal, resolution-independent sampling distributions. Thanks to a generalization of the Restricted Isometry Property, we show that stable recovery holds when the number of measurements is proportional to the prior's intrinsic dimension (up to logarithmic factors), independent of the ambient dimension. Finally, numerical experiments on the Darcy flow equation validate our theoretical findings and demonstrate that in severely undersampled regimes, employing lower-resolution generators acts as an implicit regularizer, improving reconstruction stability.
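To ground the recovery setting above, here is a minimal, hedged sketch of the standard latent-space optimization that generative compressed sensing builds on. It is a finite-dimensional prototype, not the paper's Hilbert-space algorithm; the generator `G`, measurement matrix `A`, measurements `y`, and `latent_dim` are assumed inputs of this illustration.

```python
# Hedged sketch: recover a signal from undersampled linear measurements
# y = A x by searching over the range of a trained generator G. This is the
# usual finite-dimensional prototype, not the paper's infinite-dimensional
# method; G, A, y, and latent_dim are assumptions of this illustration.
import torch

def recover(G, A, y, latent_dim, steps=500, lr=1e-2):
    """Estimate x by minimizing ||A G(z) - y||^2 over the latent code z."""
    z = torch.zeros(latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = (A @ G(z) - y).pow(2).sum()  # data-fidelity objective
        loss.backward()
        opt.step()
    return G(z).detach()  # reconstruction constrained to the range of G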
Related papers
- Deep Sequence Modeling with Quantum Dynamics: Language as a Wave Function
We introduce a sequence modeling framework in which the latent state is a complex-valued wave function evolving on a finite-dimensional Hilbert space under a learned, time-dependent Hamiltonian. Token probabilities are extracted using the Born rule, a quadratic measurement operator that couples magnitudes and relative phases. We derive a continuity equation for the latent probability mass, yielding conserved pairwise currents that serve as a built-in diagnostic.
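A hedged illustration of the Born-rule readout described above (my sketch, not the paper's code; the complex readout matrix `E` is a hypothetical stand-in for its measurement operator):

```python
# Born-rule token readout: probabilities are squared magnitudes of complex
# amplitudes, so relative phases between components of the state matter.
# E is a hypothetical readout matrix; the paper's operator may differ.
import torch

def born_probabilities(psi, E):
    """psi: complex state vector of shape (d,); E: complex matrix (V, d)."""
    amplitudes = E @ psi        # one complex amplitude per vocabulary token
    p = amplitudes.abs() ** 2   # Born rule: probability = |amplitude|^2
    return p / p.sum()          # normalize over the vocabulary
```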
arXiv Detail & Related papers (2026-02-24T23:42:18Z)
- Neural Optimal Transport in Hilbert Spaces: Characterizing Spurious Solutions and Gaussian Smoothing
In non-regular settings, semi-dual neural OT often generates spurious solutions that fail to accurately capture target distributions. We analytically characterize this spurious-solution phenomenon using the framework of regular measures, which generalize Lebesgue absolute continuity in finite dimensions. To resolve the ill-posedness, we extend the semi-dual framework via a Gaussian smoothing strategy based on Brownian motion.
arXiv Detail & Related papers (2026-02-15T10:27:09Z)
- Flow Straight and Fast in Hilbert Space: Functional Rectified Flow
We establish a rigorous functional formulation of rectified flow in an infinite-dimensional Hilbert space. We show that this framework extends naturally to functional flow matching and functional probability flow ODEs. Our method achieves superior performance compared to existing functional generative models.
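As a hedged, finite-dimensional illustration of the rectified-flow objective this paper lifts to Hilbert space (names here are mine, not the paper's):

```python
# Rectified-flow training objective: regress a velocity network v(x_t, t)
# onto the constant velocity of straight-line paths between noise x0 and
# data x1. Finite-dimensional sketch; the paper works in Hilbert space.
import torch

def rectified_flow_loss(v, x0, x1):
    """v: callable (x_t, t) -> velocity; x0, x1: tensors of shape (batch, d)."""
    t = torch.rand(x0.shape[0], 1)  # uniform time in [0, 1]
    xt = (1 - t) * x0 + t * x1      # straight-line interpolation
    target = x1 - x0                # constant velocity along the line
    return (v(xt, t) - target).pow(2).mean()
```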
arXiv Detail & Related papers (2025-09-12T16:18:16Z)
- Preconditioned Langevin Dynamics with Score-Based Generative Models for Infinite-Dimensional Linear Bayesian Inverse Problems
We study Langevin dynamics driven by score-based generative models (SGMs) acting as priors, formulated directly in function space. We derive, for the first time, error estimates that explicitly depend on the approximation error of the score. As a consequence, we obtain sufficient conditions for global convergence in Kullback-Leibler divergence on the underlying function space.
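A hedged sketch of the underlying idea on a discretized space (unadjusted Langevin with a learned prior score; `score_prior` stands in for the SGM, and the paper's preconditioning and function-space analysis are omitted):

```python
# Unadjusted Langevin step targeting the posterior of y = A x + noise,
# with the prior score supplied by a learned network. Discretized sketch
# only; the paper's preconditioned function-space dynamics are richer.
import torch

def langevin_step(x, y, A, score_prior, noise_std, eps=1e-3):
    grad_lik = A.T @ (y - A @ x) / noise_std**2  # likelihood score
    drift = grad_lik + score_prior(x)            # estimated posterior score
    return x + eps * drift + (2 * eps) ** 0.5 * torch.randn_like(x)
```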
arXiv Detail & Related papers (2025-05-23T18:12:04Z)
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z)
- Geometric Neural Diffusion Processes
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that, under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Normalizing flows for lattice gauge theory in arbitrary space-time dimension
Applications of normalizing flows to the sampling of field configurations in lattice gauge theory have so far been explored almost exclusively in two space-time dimensions.
We discuss masked autoregressive transformations with tractable and unbiased Jacobian determinants, a key ingredient for scalable and exact flow-based sampling algorithms (illustrated in the sketch after this entry's summary).
For concreteness, results from a proof-of-principle application to SU(3) gauge theory in four space-time dimensions are reported.
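As a hedged aside on why masked autoregressive transformations have tractable Jacobians (a generic normalizing-flow fact, not this paper's gauge-equivariant construction; `made` is a hypothetical masked network):

```python
# When output i of a transform depends only on inputs 1..i (enforced by
# masking), the Jacobian is triangular, so its log-determinant is just the
# sum of per-dimension log-scales. Generic sketch, not the paper's model.
import torch

def affine_autoregressive_forward(x, made):
    """x: (batch, d); made(x) -> (log_scale, shift), each (batch, d),
    where entry i is computed only from x[:, :i]."""
    log_scale, shift = made(x)
    y = x * torch.exp(log_scale) + shift
    log_det = log_scale.sum(dim=1)  # triangular Jacobian determinant
    return y, log_det
```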
arXiv Detail & Related papers (2023-05-03T19:54:04Z)
- Continuous percolation in a Hilbert space for a large system of qubits
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z)
- Unveiling the Latent Space Geometry of Push-Forward Generative Models
Many deep generative models are defined as a push-forward of a Gaussian measure by a continuous generator, such as Generative Adversarial Networks (GANs) or Variational Auto-Encoders (VAEs).
This work explores the latent space of such deep generative models.
A key issue with these models is their tendency to output samples outside of the support of the target distribution when learning disconnected distributions.
arXiv Detail & Related papers (2022-07-21T15:29:35Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions
A complete recipe for measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
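For background, the Euclidean "complete recipe" being generalized here (recalled from Ma, Chen, and Fox, 2015, not from this abstract) writes every diffusion with stationary density $\pi$ on $\mathbb{R}^d$ as

```latex
dX_t = \big[\,(D(X_t) + Q(X_t))\,\nabla \log \pi(X_t) + \Gamma(X_t)\,\big]\,dt
       + \sqrt{2 D(X_t)}\, dW_t,
\qquad
\Gamma_i(x) = \sum_j \partial_{x_j}\big(D_{ij}(x) + Q_{ij}(x)\big),
```

with $D$ symmetric positive semidefinite and $Q$ skew-symmetric; different choices of $D$ and $Q$ recover different MCMC samplers.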
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Sample Complexity Bounds for 1-bit Compressive Sensing and Binary Stable Embeddings with Generative Priors
Motivated by advances in compressive sensing with generative models, we study the problem of 1-bit compressive sensing with generative models.
We first consider noiseless 1-bit measurements, and provide sample complexity bounds for approximate recovery under i.i.d. Gaussian measurements.
We demonstrate that the Binary $\epsilon$-Stable Embedding property, which characterizes the robustness of the reconstruction to measurement errors and noise, also holds for 1-bit compressive sensing with Lipschitz continuous generative models.
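A hedged sketch of the 1-bit setting (sign-only measurements) with a generative prior; `G` and `latent_dim` are illustrative, not details from the paper, and the final normalization reflects that signal magnitude is unrecoverable from signs:

```python
# 1-bit compressive sensing with a generative prior: observe only the signs
# y_sign = sign(A x) and maximize sign agreement over the range of G.
# Illustrative sketch; not the algorithm analyzed in the paper.
import torch

def one_bit_recover(G, A, y_sign, latent_dim, steps=500, lr=1e-2):
    z = torch.randn(latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        xg = G(z)
        loss = -(y_sign * (A @ xg)).sum() / xg.norm()  # scale-invariant agreement
        loss.backward()
        opt.step()
    x_hat = G(z).detach()
    return x_hat / x_hat.norm()  # magnitude is lost under 1-bit measurements
```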
arXiv Detail & Related papers (2020-02-05T09:44:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.