Staying on the Manifold: Geometry-Aware Noise Injection
- URL: http://arxiv.org/abs/2509.20201v1
- Date: Wed, 24 Sep 2025 14:58:38 GMT
- Title: Staying on the Manifold: Geometry-Aware Noise Injection
- Authors: Albert Kjøller Jacobsen, Johanna Marie Gegenfurtner, Georgios Arvanitidis
- Abstract summary: It has been shown that perturbing the input during training implicitly regularises the gradient of the learnt function. Previous research mostly considered the addition of ambient noise in the input space, without considering the underlying structure of the data. We propose several methods of adding geometry-aware input noise that accounts for the lower-dimensional manifold that the input data inhabit.
- Score: 3.897275210728671
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: It has been shown that perturbing the input during training implicitly regularises the gradient of the learnt function, leading to smoother models and enhancing generalisation. However, previous research mostly considered the addition of ambient noise in the input space, without considering the underlying structure of the data. In this work, we propose several methods of adding geometry-aware input noise that accounts for the lower-dimensional manifold that the input data inhabit. We start by projecting ambient Gaussian noise onto the tangent space of the manifold. In a second step, the noise sample is mapped onto the manifold via the associated geodesic curve. We also consider Brownian motion noise, which moves in random steps along the manifold. We show that geometry-aware noise leads to improved generalisation and robustness to hyperparameter selection on highly curved manifolds, while performing at least as well as training without noise on simpler manifolds. Our proposed framework extends to learned data manifolds.
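To make the construction concrete, here is a minimal sketch of both noise schemes on the unit sphere, where the tangent projection and the exponential map (geodesic) have closed forms. The function names, noise scale, and step count are illustrative assumptions, not the authors' implementation; the paper's framework targets general and learned manifolds, where these maps must typically be computed numerically.

```python
import numpy as np

def project_to_tangent(x, eps):
    """Project ambient noise eps onto the tangent space of the unit sphere at x."""
    return eps - np.dot(eps, x) * x

def exp_map_sphere(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x in tangent direction v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def geodesic_noise(x, sigma, rng):
    """Tangent-projected Gaussian noise, mapped back onto the manifold via the geodesic."""
    eps = rng.normal(scale=sigma, size=x.shape)
    v = project_to_tangent(x, eps)
    return exp_map_sphere(x, v)

def brownian_noise(x, sigma, n_steps, rng):
    """Brownian-motion-style noise: many small tangent steps, re-mapped after each one."""
    step_scale = sigma / np.sqrt(n_steps)  # total variance stays ~ sigma^2
    for _ in range(n_steps):
        x = geodesic_noise(x, step_scale, rng)
    return x

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 0.0])            # a point on the unit sphere S^2
x_geo = geodesic_noise(x, 0.1, rng)
x_bm = brownian_noise(x, 0.1, 10, rng)
print(np.linalg.norm(x_geo), np.linalg.norm(x_bm))  # both ~1.0: the samples stay on the sphere
```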
Related papers
- Mitigating the Noise Shift for Denoising Generative Models via Noise Awareness Guidance [54.88271057438763]
Noise Awareness Guidance (NAG) is a correction method that explicitly steers sampling trajectories to remain consistent with the pre-defined noise schedule. NAG consistently mitigates noise shift and substantially improves the generation quality of mainstream diffusion models.
arXiv Detail & Related papers (2025-10-14T13:31:34Z)
- Robust Tangent Space Estimation via Laplacian Eigenvector Gradient Orthogonalization [48.25304391127552]
Estimating the tangent spaces of a data manifold is a fundamental problem in data analysis. We propose a method, Laplacian Eigenvector Gradient Orthogonalization (LEGO), that utilizes the global structure of the data to guide local tangent space estimation.
arXiv Detail & Related papers (2025-10-02T17:59:45Z)
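The summary above does not spell out LEGO's algorithm; as a point of reference, here is the classical local-PCA tangent estimator that such methods build on and refine. This is a baseline sketch under assumed parameter names, not LEGO itself.

```python
import numpy as np

def local_pca_tangent(X, i, k=10, d=2):
    """Classical local-PCA tangent estimate at X[i]: the top-d principal
    directions of the k nearest neighbours (baseline only, not LEGO)."""
    dists = np.linalg.norm(X - X[i], axis=1)
    nbrs = X[np.argsort(dists)[:k]]
    centered = nbrs - nbrs.mean(axis=0)
    # Right singular vectors give the principal directions of the neighbourhood.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:d]                        # rows span the estimated tangent space

X = np.random.default_rng(0).normal(size=(200, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # points on the unit sphere
T = local_pca_tangent(X, i=0, k=15, d=2)       # 2 x 3 basis, roughly orthogonal to X[0]
```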
- Spacetime Geometry of Denoising in Diffusion Models [20.644091294762678]
We present a novel perspective on diffusion models using the framework of information geometry. We show that the set of noisy samples, taken across all noise levels simultaneously, forms a statistical manifold. We demonstrate the practical value of this geometric viewpoint in transition path sampling.
arXiv Detail & Related papers (2025-05-23T06:16:58Z)
- FreSca: Scaling in Frequency Space Enhances Diffusion Models [55.75504192166779]
This paper explores frequency-based control within latent diffusion models. We introduce FreSca, a novel framework that decomposes the noise difference into low- and high-frequency components. FreSca operates without any model retraining or architectural change, offering model- and task-agnostic control.
arXiv Detail & Related papers (2025-04-02T22:03:11Z)
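The abstract does not give FreSca's exact formulation; the hypothetical sketch below only illustrates the general idea of splitting a noise-difference map into low- and high-frequency bands with an FFT and scaling each band independently. The cutoff and scale parameters are invented for illustration.

```python
import numpy as np

def frequency_scaled(delta, cutoff=0.25, low_scale=1.0, high_scale=1.5):
    """Split a 2-D noise-difference map into low/high frequency bands via FFT
    and rescale each band separately (FreSca-style sketch, parameters assumed)."""
    f = np.fft.fftshift(np.fft.fft2(delta))          # DC component moved to the centre
    h, w = delta.shape
    yy, xx = np.mgrid[-(h // 2):(h + 1) // 2, -(w // 2):(w + 1) // 2]
    radius = np.sqrt((yy / (h / 2)) ** 2 + (xx / (w / 2)) ** 2)
    low_mask = radius <= cutoff                      # normalised frequency radius
    f_scaled = np.where(low_mask, low_scale * f, high_scale * f)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f_scaled)))
```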
- Impact of Noisy Supervision in Foundation Model Learning [91.56591923244943]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets. We propose a tuning method (NMTune) to affinely transform the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z)
- Latent Class-Conditional Noise Model [54.56899309997246]
We introduce a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.
We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels.
Our approach safeguards the stable update of the noise transition, avoiding the arbitrary tuning from a mini-batch of samples seen in previous work.
arXiv Detail & Related papers (2023-02-19T15:24:37Z)
- Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
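As a rough illustration of posterior-sampling guidance in this spirit (not the paper's exact algorithm), the sketch below nudges the current diffusion iterate so that its denoised estimate better explains a linear measurement. The denoiser `x0_hat_fn` is a stand-in, and the finite-difference gradient merely avoids an autodiff dependency.

```python
import numpy as np

def dps_correction(x_t, x0_hat_fn, y, A, zeta=1.0, eps=1e-3):
    """One posterior-guidance correction (DPS-style sketch): step x_t down the
    gradient of ||y - A x0_hat(x_t)||^2 so the denoised estimate fits y = A x + noise."""
    def residual(x):
        return np.sum((y - A @ x0_hat_fn(x)) ** 2)
    grad = np.zeros_like(x_t)
    for j in range(x_t.size):            # finite-difference gradient (illustrative only)
        e = np.zeros_like(x_t)
        e[j] = eps
        grad[j] = (residual(x_t + e) - residual(x_t - e)) / (2 * eps)
    return x_t - zeta * grad

# Toy usage: identity "denoiser" and a random linear measurement operator.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 5))
y = A @ rng.normal(size=5)
x = dps_correction(rng.normal(size=5), lambda z: z, y, A, zeta=0.05)
```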
- On the Theoretical Properties of Noise Correlation in Stochastic Optimization [6.970991851511823]
We show that fPGD, a fractional-noise variant of perturbed gradient descent (PGD), possesses exploration abilities favorable over PGD and Anti-PGD.
These results open the field to novel ways to exploit noise for machine learning models.
arXiv Detail & Related papers (2022-09-19T16:32:22Z)
- Multiview point cloud registration with anisotropic and space-varying localization noise [1.5499426028105903]
We address the problem of registering multiple point clouds corrupted with highly anisotropic localization noise.
Existing methods are based on an implicit assumption of space-invariant isotropic noise.
We show that our noise handling strategy significantly improves the robustness to high levels of anisotropic noise.
arXiv Detail & Related papers (2022-01-03T15:21:24Z)
- LAAT: Locally Aligned Ant Technique for discovering multiple faint low-dimensional structures of varying density [0.0]
In manifold learning, several studies offer solutions for removing background noise, or noise close to the structure, when the structure's density is substantially higher than that of the noise.
We propose a novel method to extract manifold points in the presence of noise based on the idea of ant colony optimization.
In contrast to the existing random walk solutions, our technique captures points that are locally aligned with major directions of the manifold.
arXiv Detail & Related papers (2020-09-17T14:22:50Z)
- Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training overparameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
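A small sketch of this contrast, under assumed settings (linear regression, invented noise scales, not the paper's experiments): perturbing the label induces gradient noise whose covariance depends on the data, unlike spherical Gaussian noise added directly to the gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 5))
w_true = rng.normal(size=5)
y = X @ w_true

def sgd_step(w, x, t, noise, lr=0.01):
    grad = 2 * (x @ w - t) * x           # squared-loss gradient for one sample
    return w - lr * (grad + noise)

w_gauss = np.zeros(5)
w_label = np.zeros(5)
for step in range(5000):
    i = rng.integers(len(X))
    x, t = X[i], y[i]
    # (a) spherical Gaussian noise: isotropic, parameter-independent covariance
    w_gauss = sgd_step(w_gauss, x, t, rng.normal(scale=0.5, size=5))
    # (b) label perturbation: training on t + delta injects noise -2*delta*x,
    # whose covariance ~ x x^T depends on the data -- the "shape" that matters
    delta = rng.normal(scale=0.5)
    w_label = sgd_step(w_label, x, t + delta, np.zeros(5))

print(np.linalg.norm(w_gauss - w_true), np.linalg.norm(w_label - w_true))
```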
- Manifold Fitting under Unbounded Noise [4.54773250519101]
We introduce a new manifold-fitting method, by which the output manifold is constructed by directly estimating the tangent spaces at the projected points on the underlying manifold.
Our new method provides theoretical convergence guarantees in high probability, in terms of an upper bound on the distance between the estimated and underlying manifolds.
arXiv Detail & Related papers (2019-09-23T08:55:41Z)