Stability and Concentration in Nonlinear Inverse Problems with Block-Structured Parameters: Lipschitz Geometry, Identifiability, and an Application to Gaussian Splatting
- URL: http://arxiv.org/abs/2602.09415v1
- Date: Tue, 10 Feb 2026 05:11:06 GMT
- Title: Stability and Concentration in Nonlinear Inverse Problems with Block-Structured Parameters: Lipschitz Geometry, Identifiability, and an Application to Gaussian Splatting
- Authors: Joe-Mei Feng, Hsin-Hsiung Kao
- Abstract summary: We develop an operator-theoretic framework for stability and statistical concentration in nonlinear inverse problems with block-structured parameters. Overall, the analysis characterizes operator-level limits for a broad class of high-dimensional nonlinear inverse problems arising in modern imaging and differentiable rendering.
- Score: 0.552480439325792
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop an operator-theoretic framework for stability and statistical concentration in nonlinear inverse problems with block-structured parameters. Under a unified set of assumptions combining blockwise Lipschitz geometry, local identifiability, and sub-Gaussian noise, we establish deterministic stability inequalities, global Lipschitz bounds for least-squares misfit functionals, and nonasymptotic concentration estimates. These results yield high-probability parameter error bounds that are intrinsic to the forward operator and independent of any specific reconstruction algorithm. As a concrete instantiation, we verify that the Gaussian Splatting rendering operator satisfies the proposed assumptions and derive explicit constants governing its Lipschitz continuity and resolution-dependent observability. This leads to a fundamental stability--resolution tradeoff, showing that estimation error is inherently constrained by the ratio between image resolution and model complexity. Overall, the analysis characterizes operator-level limits for a broad class of high-dimensional nonlinear inverse problems arising in modern imaging and differentiable rendering.
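As a rough numerical companion to the abstract, the sketch below probes blockwise Lipschitz geometry for a toy 1-D "splatting" operator (three Gaussian bumps rendered to 64 pixels). The operator, constants, and probing scheme are illustrative assumptions, not the paper's construction; the point is that an empirical lower Lipschitz constant `mu_hat > 0` turns data misfit into a parameter error bound.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(theta):
    """Toy block-structured forward operator: a 1-D 'splatting' analogue.

    theta splits into a position block and an amplitude block; the output
    is a 64-pixel rendering of three Gaussian bumps.
    """
    pos, amp = theta[:3], theta[3:]
    grid = np.linspace(0.0, 1.0, 64)
    return (amp[:, None] * np.exp(-((grid[None, :] - pos[:, None]) ** 2) / 0.01)).sum(axis=0)

theta0 = np.array([0.2, 0.5, 0.8, 1.0, 0.7, 1.3])  # (positions | amplitudes)

# Probe the local Lipschitz geometry with small random perturbations.
ratios = []
for _ in range(2000):
    d = 1e-3 * rng.standard_normal(6)
    ratios.append(np.linalg.norm(forward(theta0 + d) - forward(theta0)) / np.linalg.norm(d))
L_hat, mu_hat = max(ratios), min(ratios)  # upper / lower Lipschitz estimates

# Deterministic stability reading: along every probed direction d,
#   ||d|| <= (1 / mu_hat) * ||F(theta0 + d) - F(theta0)||,
# i.e. parameter error is controlled by data misfit once mu_hat > 0.
print(f"L_hat = {L_hat:.3f}, mu_hat = {mu_hat:.3f}")
```

A genuinely degenerate configuration (e.g. two bumps at the same position) would drive the empirical `mu_hat` toward zero, which is the listing's identifiability condition failing.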
Related papers
- The Price of Robustness: Stable Classifiers Need Overparameterization [17.335490896384265]
We establish a generalization bound for finite function classes that improves inversely with class stability. We derive as a corollary a law of robustness for classification that extends the results of Bubeck and Sellke. Experiments support our theory, while traditional norm-based measures remain largely uninformative.
arXiv Detail & Related papers (2026-03-03T09:47:06Z)
- Stability and Generalization of Push-Sum Based Decentralized Optimization over Directed Graphs [55.77845440440496]
Push-based decentralized communication enables optimization over communication networks where information exchange may be asymmetric. We develop a unified uniform-stability framework for the Stochastic Gradient Push (SGP) algorithm. A key technical ingredient is an imbalance-aware generalization bound expressed through two quantities.
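The push-sum mechanism underlying SGP can be sketched in a few lines. This toy (a directed ring with self-loops and a column-stochastic mixing matrix, all illustrative choices) shows only the consensus step that handles asymmetric communication; SGP additionally interleaves stochastic gradient updates.

```python
import numpy as np

# Column-stochastic mixing matrix for a directed ring with self-loops:
# node i sends equal shares to itself and to node (i+1) mod n.
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 0.5
    A[(i + 1) % n, i] = 0.5  # each column sums to 1

x = np.array([1.0, 4.0, 2.0, 8.0, 5.0])  # local values; network average = 4.0
w = np.ones(n)                            # push-sum weights

for _ in range(200):
    x = A @ x  # push values along directed edges
    w = A @ w  # push weights along the same edges

ratio = x / w  # de-biased estimate: converges to the average at every node
print(ratio)
```

The weight vector `w` corrects the bias that directed (non-doubly-stochastic) communication would otherwise introduce, which is exactly the imbalance the abstract's two quantities measure.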
arXiv Detail & Related papers (2026-02-24T05:32:03Z)
- Optimizing Parallel Schemes with Lyapunov Exponents and kNN-LLE Estimation [0.0]
We present a unified analytical-data-driven methodology for identifying, measuring, and reducing such instabilities in inverse parallel solvers. On the theoretical side, we derive stability and bifurcation characterizations of the underlying iterative maps. On the computational side, we introduce a micro-series pipeline based on kNN-driven estimation of the local largest Lyapunov exponent.
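A minimal nearest-neighbor estimate of the largest Lyapunov exponent (in the style of Rosenstein et al., here on the logistic map at r = 4, whose true exponent is ln 2) might look like the following. The horizon and Theiler window are illustrative choices, not the paper's pipeline.

```python
import numpy as np

# Logistic map at r = 4: chaotic, largest Lyapunov exponent = ln 2 ≈ 0.693.
def orbit(x0, n):
    xs = np.empty(n)
    x = x0
    for i in range(n):
        xs[i] = x
        x = 4.0 * x * (1.0 - x)
    return xs

series = orbit(0.123, 4000)

# For each point, find its nearest neighbor (excluding temporally close
# points), then track the average log divergence of the two trajectories.
horizon, theiler = 8, 10
N = len(series) - horizon
logdiv = np.zeros(horizon)
for i in range(N):
    d0 = np.abs(series[:N] - series[i])
    d0[max(0, i - theiler): i + theiler + 1] = np.inf  # Theiler window
    j = int(np.argmin(d0))
    for k in range(horizon):
        logdiv[k] += np.log(abs(series[i + k] - series[j + k]) + 1e-15)
logdiv /= N

# Slope of mean log-divergence vs. time ≈ largest Lyapunov exponent.
lle = np.polyfit(np.arange(horizon), logdiv, 1)[0]
print(f"estimated LLE ≈ {lle:.3f} (true: ln 2 ≈ {np.log(2):.3f})")
```

For a 1-D map a scalar embedding suffices; real solver diagnostics would add delay embedding and k > 1 neighbors.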
arXiv Detail & Related papers (2026-01-20T05:09:52Z)
- Deep Eigenspace Network and Its Application to Parametric Non-selfadjoint Eigenvalue Problems [0.12744523252873352]
We consider operator learning for efficiently solving parametric non-selfadjoint eigenvalue problems. We introduce a hybrid framework that learns the stable invariant eigensubspace mapping rather than individual eigenfunctions.
arXiv Detail & Related papers (2025-12-23T05:20:22Z)
- Revisiting Zeroth-Order Optimization: Minimum-Variance Two-Point Estimators and Directionally Aligned Perturbations [57.179679246370114]
We identify the distribution of random perturbations that minimizes the estimator's variance as the perturbation stepsize tends to zero. Our findings reveal that such desired perturbations can align directionally with the true gradient, instead of maintaining a fixed length.
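The classic two-point zeroth-order estimator this entry revisits is easy to state, and a quick Monte Carlo run shows that the perturbation distribution changes the estimator's variance. The sketch below contrasts fixed-length (sphere) and Gaussian perturbations on a linear test function; it illustrates the sensitivity to the perturbation law, not the paper's optimal aligned construction.

```python
import numpy as np

rng = np.random.default_rng(1)
d, delta, trials = 20, 1e-4, 20000

a = rng.standard_normal(d)  # gradient of the linear test function
f = lambda x: a @ x
x0 = np.zeros(d)

def two_point(u):
    # Classic two-point zeroth-order gradient estimator.
    return (f(x0 + delta * u) - f(x0 - delta * u)) / (2 * delta) * u

errs = {}
for name in ("sphere", "gaussian"):
    sq_err = 0.0
    for _ in range(trials):
        u = rng.standard_normal(d)
        if name == "sphere":
            u *= np.sqrt(d) / np.linalg.norm(u)  # fixed length sqrt(d)
        sq_err += np.linalg.norm(two_point(u) - a) ** 2
    errs[name] = sq_err / trials

print(errs)  # fixed-length sphere perturbations give lower mean-squared error
```

For a linear objective both variants are unbiased, so the mean-squared error here is pure variance; the fixed-length normalization removes the fluctuation of the squared norm that the Gaussian perturbations carry.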
arXiv Detail & Related papers (2025-10-22T19:06:39Z)
- Non-Asymptotic Stability and Consistency Guarantees for Physics-Informed Neural Networks via Coercive Operator Analysis [0.0]
We present a unified theoretical framework for analyzing the stability and consistency of Physics-Informed Neural Networks (PINNs). PINNs approximate solutions to partial differential equations (PDEs) by minimizing residual losses over sampled collocation and boundary points. We formalize both operator-level and variational notions of consistency, proving that residual minimization in Sobolev norms leads to convergence in energy and uniform norms under mild regularity.
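The residual-minimization principle can be illustrated without a neural network: replacing the PINN ansatz with a polynomial makes the collocation problem linear least squares. The model problem and boundary weight below are illustrative assumptions, not the paper's setting.

```python
import numpy as np

# PINN-style residual minimization with a polynomial ansatz. Model problem:
#   u''(x) = -pi^2 sin(pi x) on (0, 1),  u(0) = u(1) = 0,  exact u = sin(pi x).
m = 12                                  # polynomial degree of the ansatz
j = np.arange(m + 1)
xc = np.linspace(0.0, 1.0, 40)          # collocation points

# PDE residual rows: sum_j c_j * j*(j-1) * x^(j-2)  =  -pi^2 sin(pi x)
A_pde = (j * (j - 1)) * xc[:, None] ** np.clip(j - 2, 0, None)
b_pde = -np.pi ** 2 * np.sin(np.pi * xc)

# Boundary rows, heavily weighted (mimicking a boundary-loss term).
w = 100.0
A_bc = w * np.array([0.0, 1.0])[:, None] ** j
b_bc = np.zeros(2)

coef, *_ = np.linalg.lstsq(np.vstack([A_pde, A_bc]),
                           np.concatenate([b_pde, b_bc]), rcond=None)

xt = np.linspace(0.0, 1.0, 200)
u = (coef * xt[:, None] ** j).sum(axis=1)
err = float(np.max(np.abs(u - np.sin(np.pi * xt))))
print(f"max error of residual-minimizing ansatz: {err:.2e}")
```

A small PDE residual at the collocation points forces a small solution error here, which is the consistency statement the abstract formalizes for the nonlinear neural-network case.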
arXiv Detail & Related papers (2025-06-16T14:41:15Z)
- Wasserstein Distributionally Robust Nonparametric Regression [9.65010022854885]
This paper studies the generalization properties of Wasserstein distributionally robust nonparametric estimators. We establish non-asymptotic error bounds for the excess local worst-case risk. The robustness of the proposed estimator is evaluated through simulation studies and illustrated with an application to the MNIST dataset.
arXiv Detail & Related papers (2025-05-12T18:07:37Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate. We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- Likelihood Ratio Confidence Sets for Sequential Decision Making [51.66638486226482]
We revisit the likelihood-based inference principle and propose to use likelihood ratios to construct valid confidence sequences.
Our method is especially suitable for problems with well-specified likelihoods.
We show how to provably choose the best sequence of estimators and shed light on connections to online convex optimization.
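One standard way to build a likelihood-ratio confidence sequence, here for a Bernoulli parameter with a Laplace-smoothed running mean as the predictable estimator sequence (illustrative choices, not necessarily the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true, T, alpha = 0.3, 2000, 0.05
xs = rng.binomial(1, theta_true, size=T)

# Running likelihood ratio  Lambda_t(theta) = prod_i p_hat_{i-1}(x_i) / p_theta(x_i),
# with a predictable plug-in p_hat. Under theta, Lambda is a nonnegative
# martingale, so by Ville's inequality
#   C_t = { theta : Lambda_t(theta) < 1/alpha }
# is a valid (1 - alpha) confidence sequence, uniformly over time.
grid = np.linspace(0.01, 0.99, 99)
log_lr = np.zeros_like(grid)
heads = 0
for t, x in enumerate(xs):
    p_hat = (heads + 1) / (t + 2)  # Laplace-smoothed running mean (predictable)
    log_lr += np.log(p_hat if x else 1.0 - p_hat) - np.log(np.where(x, grid, 1.0 - grid))
    heads += int(x)

conf = grid[log_lr < np.log(1.0 / alpha)]
print(f"confidence set after {T} rounds: [{conf.min():.2f}, {conf.max():.2f}]")
```

Because the guarantee is time-uniform, the set can be read off and acted on at any stopping time, which is what makes the construction suitable for sequential decision making.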
arXiv Detail & Related papers (2023-11-08T00:10:21Z)
- Optimal variance-reduced stochastic approximation in Banach spaces [114.8734960258221]
We study the problem of estimating the fixed point of a contractive operator defined on a separable Banach space.
We establish non-asymptotic bounds for both the operator defect and the estimation error.
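A bare-bones sketch of stochastic approximation of a contractive fixed point, using plain Polyak-Ruppert averaging on an illustrative affine contraction; the paper's variance-reduced estimator is more refined.

```python
import numpy as np

rng = np.random.default_rng(3)

# Contractive affine operator T(x) = A x + b (spectral radius < 1).
A = np.array([[0.4, 0.1],
              [0.0, 0.5]])
b = np.array([1.0, -2.0])
x_star = np.linalg.solve(np.eye(2) - A, b)  # exact fixed point x* = (1, -4)

x = np.zeros(2)
avg = np.zeros(2)
n = 50_000
for k in range(1, n + 1):
    noisy_T = A @ x + b + 0.5 * rng.standard_normal(2)  # noisy oracle for T(x)
    x = x + (1.0 / k ** 0.7) * (noisy_T - x)            # stochastic approximation step
    avg += (x - avg) / k                                # Polyak-Ruppert average

print(np.linalg.norm(avg - x_star))  # estimation error of the averaged iterate
```

The gap between the noisy oracle and the true map plays the role of the operator defect in the abstract, and the averaged iterate's error is the estimation error it bounds.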
arXiv Detail & Related papers (2022-01-21T02:46:57Z)
- On the Stability Properties and the Optimization Landscape of Training Problems with Squared Loss for Neural Networks and General Nonlinear Conic Approximation Schemes [0.0]
We study the optimization landscape and the stability properties of training problems with squared loss for neural networks and general nonlinear conic approximation schemes.
We prove that the same effects that are responsible for these instability properties are also the reason for the emergence of saddle points and spurious local minima.
arXiv Detail & Related papers (2020-11-06T11:34:59Z)
- Fine-Grained Analysis of Stability and Generalization for Stochastic Gradient Descent [55.85456985750134]
We introduce a new stability measure called on-average model stability, for which we develop novel bounds controlled by the risks of SGD iterates.
This yields generalization bounds depending on the behavior of the best model, and leads to the first-ever-known fast bounds in the low-noise setting.
To the best of our knowledge, this gives the first-ever-known stability and generalization bounds for SGD with even non-differentiable loss functions.
arXiv Detail & Related papers (2020-06-15T06:30:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.