Risk-Sensitive Diffusion for Perturbation-Robust Optimization
- URL: http://arxiv.org/abs/2402.02081v2
- Date: Fri, 5 Apr 2024 10:19:43 GMT
- Title: Risk-Sensitive Diffusion for Perturbation-Robust Optimization
- Authors: Yangming Li, Max Ruiz Luyten, Mihaela van der Schaar
- Abstract summary: We show that noisy samples induce a different objective function from the score-matching one, which misguides the optimization of the model.
We introduce risk-sensitive SDE, a type of stochastic differential equation (SDE) parameterized by a risk vector.
We prove that a zero instability measure is achievable only when the noisy samples are caused by Gaussian perturbation.
- Score: 58.68233326265417
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The essence of score-based generative models (SGM) is to optimize a score-based model towards the score function. However, we show that noisy samples induce a different objective function from the score-matching one, which misguides the optimization of the model. To address this problem, we first consider a new setting where every noisy sample is paired with a risk vector indicating the data quality (e.g., noise level). This setting is very common in real-world applications, especially for medical and sensor data. Then, we introduce risk-sensitive SDE, a type of stochastic differential equation (SDE) parameterized by the risk vector. With this tool, we aim to minimize a measure called perturbation instability, which we define to quantify the negative impact of noisy samples on optimization. We prove that a zero instability measure is achievable only when the noisy samples are caused by Gaussian perturbation. For non-Gaussian cases, we also provide the optimal coefficients that minimize the misguidance of noisy samples. To apply risk-sensitive SDE in practice, we extend widely used diffusion models to their risk-sensitive versions and derive a risk-free loss that is efficient to compute. We also conducted numerical experiments to confirm the validity of our theorems and to show that they make SGM robust to noisy samples during optimization.
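The risk-conditioned forward perturbation described above can be sketched in a few lines. This is a rough illustrative sketch of the general idea only, not the paper's exact coefficients or loss: a sample known to carry Gaussian noise of standard deviation `risk` is treated as if it were already partway along the diffusion, so the schedule injects only the remaining variance. The function name and the variance-subtraction rule are assumptions for illustration.

```python
import numpy as np

def risk_sensitive_perturb(x0, t, risk, betas, rng=None):
    """Forward diffusion perturbation with a per-sample risk level (sketch).

    For data pre-corrupted by Gaussian noise with std `risk`, the effective
    noise variance injected at step t is the schedule variance minus the
    (scaled) variance the sample already carries, floored at zero.
    """
    rng = np.random.default_rng() if rng is None else rng
    alpha_bar = np.cumprod(1.0 - betas)[t]          # cumulative signal scale
    sched_var = 1.0 - alpha_bar                     # schedule noise variance
    eff_var = max(sched_var - alpha_bar * risk ** 2, 0.0)
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar) * x0 + np.sqrt(eff_var) * eps
    return xt, eps
```

Under this sketch, `risk = 0` recovers the standard forward perturbation, and a larger risk value shrinks the injected noise, which is the intuition behind treating noisy samples as already partially diffused.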
Related papers
- Language-Informed Hyperspectral Image Synthesis for Imbalanced-Small Sample Classification via Semi-Supervised Conditional Diffusion Model [1.9746060146273674]
This paper proposes Txt2HSI-LDM(VAE), a novel language-informed hyperspectral image synthesis method.
To address the high-dimensionality of hyperspectral data, a universal variational autoencoder (VAE) is designed to map the data into a low-dimensional latent space.
The VAE then decodes the HSI from the latent representation generated by the diffusion model, with the language conditions as input.
arXiv Detail & Related papers (2025-02-27T02:35:49Z) - Identifying Drift, Diffusion, and Causal Structure from Temporal Snapshots [10.018568337210876]
We present the first comprehensive approach for jointly estimating the drift and diffusion of an SDE from its temporal marginals.
We show that each of these steps is optimal with respect to the Kullback-Leibler divergence.
arXiv Detail & Related papers (2024-10-30T06:28:21Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - SA-Solver: Stochastic Adams Solver for Fast Sampling of Diffusion Models [63.49229402384349]
Diffusion Probabilistic Models (DPMs) have achieved considerable success in generation tasks.
As sampling from DPMs is equivalent to solving a diffusion SDE or ODE, which is time-consuming, numerous fast sampling methods built upon improved differential-equation solvers have been proposed.
We propose SA-Solver, an improved efficient method for solving the diffusion SDE to generate data of high quality.
arXiv Detail & Related papers (2023-09-10T12:44:54Z) - Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z) - Exploring the Optimal Choice for Generative Processes in Diffusion Models: Ordinary vs Stochastic Differential Equations [6.2284442126065525]
We study the problem mathematically for two limiting scenarios: the zero diffusion (ODE) case and the large diffusion case.
Our findings indicate that when the perturbation occurs at the end of the generative process, the ODE model outperforms the SDE model with a large diffusion coefficient.
arXiv Detail & Related papers (2023-06-03T09:27:15Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Solving Diffusion ODEs with Optimal Boundary Conditions for Better Image Super-Resolution [82.50210340928173]
The randomness of diffusion models results in ineffectiveness and instability, making it challenging for users to guarantee the quality of SR results.
We propose a plug-and-play sampling method that owns the potential to benefit a series of diffusion-based SR methods.
The quality of SR results sampled by the proposed method with fewer steps outperforms the quality of results sampled by current methods with randomness from the same pre-trained diffusion-based SR model.
arXiv Detail & Related papers (2023-05-24T17:09:54Z) - Diffusion Normalizing Flow [4.94950858749529]
We present a novel generative modeling method called diffusion normalizing flow, based on stochastic differential equations (SDEs).
The algorithm consists of two neural SDEs: a forward SDE that gradually adds noise to the data to transform the data into Gaussian random noise, and a backward SDE that gradually removes the noise to sample from the data distribution.
Our algorithm demonstrates competitive performance in both high-dimension data density estimation and image generation tasks.
arXiv Detail & Related papers (2021-10-14T17:41:12Z) - Score-Based Generative Modeling through Stochastic Differential Equations [114.39209003111723]
We present a stochastic differential equation (SDE) that transforms a complex data distribution into a known prior distribution by injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
arXiv Detail & Related papers (2020-11-26T19:39:10Z)
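The forward SDE summarized in the last entry, which carries the data distribution toward a known Gaussian prior by injecting noise, can be sketched with a simple Euler-Maruyama discretization. This is an illustrative sketch assuming the commonly used variance-preserving SDE, dx = -0.5 * beta(t) * x dt + sqrt(beta(t)) dW, with a linear beta schedule; the schedule values and function name are assumptions, not taken from the summary above.

```python
import numpy as np

def vp_sde_forward(x0, n_steps=500, beta_min=0.1, beta_max=20.0, rng=None):
    """Euler-Maruyama simulation of the variance-preserving forward SDE.

    Each step applies the drift -0.5 * beta(t) * x (shrinking the signal)
    and injects Gaussian noise of variance beta(t) * dt, so that by t = 1
    the samples are approximately standard normal.
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = 1.0 / n_steps
    x = np.array(x0, dtype=float)
    for i in range(n_steps):
        t = i * dt
        beta = beta_min + t * (beta_max - beta_min)   # linear beta schedule
        drift = -0.5 * beta * x
        x = x + drift * dt + np.sqrt(beta * dt) * rng.standard_normal(x.shape)
    return x
```

Running this forward pass on any starting distribution drives the samples toward a standard Gaussian; the reverse-time SDE mentioned above then runs the same dynamics backward, with the learned score standing in for the unknown data term.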
This list is automatically generated from the titles and abstracts of the papers in this site.