Score-based generative diffusion with "active" correlated noise sources
- URL: http://arxiv.org/abs/2411.07233v1
- Date: Mon, 11 Nov 2024 18:51:08 GMT
- Title: Score-based generative diffusion with "active" correlated noise sources
- Authors: Alexandra Lamtyugina, Agnish Kumar Behera, Aditya Nandy, Carlos Floyd, Suriyanarayanan Vaikuntanathan
- Abstract summary: Diffusion models exhibit robust generative properties by approximating the underlying distribution of a dataset.
In this work, we explore how the generative performance may be modulated if noise sources with temporal correlations are used for the destruction of the data.
- Score: 38.32962448223814
- Abstract: Diffusion models exhibit robust generative properties by approximating the underlying distribution of a dataset and synthesizing data by sampling from the approximated distribution. In this work, we explore how the generative performance may be modulated if noise sources with temporal correlations -- akin to those used in the field of active matter -- are used for the destruction of the data in the forward process. Our numerical and analytical experiments suggest that the corresponding reverse process may exhibit improved generative properties.
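The key ingredient in the abstract is the use of temporally correlated ("active") noise, rather than white noise, to destroy the data in the forward process. Below is a minimal sketch of that idea on a toy 1D dataset, assuming the correlated noise is an Ornstein-Uhlenbeck variable with correlation time `tau`; the drift, amplitudes, and parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_destroy(x0, n_steps=1000, dt=1e-3, tau=0.1, D=1.0, correlated=True):
    """Euler-Maruyama sketch of a forward (destruction) process.

    correlated=True : forcing is an Ornstein-Uhlenbeck ("active") noise eta with
                      correlation time tau, <eta(t) eta(t')> ~ exp(-|t - t'| / tau).
    correlated=False: standard white-noise (Brownian) forcing, for comparison.
    """
    x = x0.copy()
    eta = np.zeros_like(x0)          # auxiliary active-noise variable
    traj = [x.copy()]
    for _ in range(n_steps):
        if correlated:
            # OU dynamics for the noise itself: tau * d(eta) = -eta dt + sqrt(2 D) dW
            eta += (-eta / tau) * dt + (np.sqrt(2 * D) / tau) * np.sqrt(dt) * rng.standard_normal(x.shape)
            # data relaxes while driven by the persistent (correlated) noise
            x += -x * dt + eta * dt
        else:
            # standard forward process: relaxation plus white noise
            x += -x * dt + np.sqrt(2 * D * dt) * rng.standard_normal(x.shape)
        traj.append(x.copy())
    return np.array(traj)

# Toy 1D "dataset": a bimodal mixture that the forward process gradually destroys.
x0 = np.concatenate([rng.normal(-2.0, 0.2, 500), rng.normal(2.0, 0.2, 500)])
traj_active = forward_destroy(x0, correlated=True)
traj_white  = forward_destroy(x0, correlated=False)
print("final variance (active):", traj_active[-1].var())
print("final variance (white): ", traj_white[-1].var())
```

Setting `correlated=False` recovers the usual white-noise forward process, so the two destruction trajectories can be compared directly on the same initial data.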
Related papers
- On the Relation Between Linear Diffusion and Power Iteration [42.158089783398616]
We study the generation process as a "correlation machine".
We show that low frequencies emerge earlier in the generation process, with the denoising basis vectors becoming more aligned to the true data at a rate that depends on their eigenvalues.
This model allows us to show that the linear diffusion model converges in mean to the leading eigenvector of the underlying data, similarly to the prevalent power iteration method (a minimal power-iteration sketch appears after this list).
arXiv Detail & Related papers (2024-10-16T07:33:12Z) - DiffATR: Diffusion-based Generative Modeling for Audio-Text Retrieval [49.076590578101985]
We present a diffusion-based ATR framework (DiffATR) that generates the joint distribution from noise.
Experiments on the AudioCaps and Clotho datasets show superior performance and verify the effectiveness of our approach.
arXiv Detail & Related papers (2024-09-16T06:33:26Z) - Likelihood-based Out-of-Distribution Detection with Denoising Diffusion Probabilistic Models [6.554019613111897]
We show that likelihood-based Out-of-Distribution detection can be extended to diffusion models.
We propose a new likelihood ratio for Out-of-Distribution detection with Deep Denoising Diffusion Models.
arXiv Detail & Related papers (2023-10-26T14:40:30Z) - Generating observation guided ensembles for data assimilation with denoising diffusion probabilistic model [0.0]
This paper presents an ensemble data assimilation method using pseudo ensembles generated by a denoising diffusion probabilistic model.
Thanks to the variance in the generated ensembles, our proposed method displays better performance than the well-established ensemble data assimilation method when the simulation model is imperfect.
arXiv Detail & Related papers (2023-08-13T07:55:46Z) - Improving Out-of-Distribution Robustness of Classifiers via Generative Interpolation [56.620403243640396]
Deep neural networks achieve superior performance for learning from independent and identically distributed (i.i.d.) data.
However, their performance deteriorates significantly when handling out-of-distribution (OoD) data.
We develop a simple yet effective method called Generative Interpolation to fuse generative models trained from multiple domains for synthesizing diverse OoD samples.
arXiv Detail & Related papers (2023-07-23T03:53:53Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - From Denoising Diffusions to Denoising Markov Models [38.33676858989955]
Denoising diffusions are state-of-the-art generative models exhibiting remarkable empirical performance.
We propose a unifying framework generalising this approach to a wide class of spaces and leading to an original extension of score matching.
arXiv Detail & Related papers (2022-11-07T14:34:27Z) - Truncated Diffusion Probabilistic Models and Diffusion-based Adversarial
Auto-Encoders [137.1060633388405]
Diffusion-based generative models learn how to generate the data by inferring a reverse diffusion chain.
We propose a faster and cheaper approach that stops adding noise before the data become pure random noise.
We show that the proposed model can be cast as an adversarial auto-encoder empowered by both the diffusion process and a learnable implicit prior.
arXiv Detail & Related papers (2022-02-19T20:18:49Z) - Inference and De-Noising of Non-Gaussian Particle Distribution Functions: A Generative Modeling Approach [0.0]
Inference on data produced by numerical simulations generally consists of binning the data to recover the particle distribution function.
Here we demonstrate the use of normalizing flows to learn a smooth, tractable approximation to the noisy particle distribution function.
arXiv Detail & Related papers (2021-10-05T16:38:04Z)
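The first related paper above connects the mean behaviour of a linear diffusion model to power iteration. As a point of reference for that claim, here is a minimal, self-contained power-iteration sketch (plain NumPy on an arbitrary toy dataset; not code from the paper) showing convergence to the leading eigenvector of a data covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data whose covariance has a clearly dominant direction.
data = rng.standard_normal((2000, 5)) @ np.diag([3.0, 1.0, 0.5, 0.2, 0.1])
cov = np.cov(data, rowvar=False)

# Power iteration: repeatedly apply the covariance and renormalise.
v = rng.standard_normal(5)
for _ in range(100):
    v = cov @ v
    v /= np.linalg.norm(v)

# Compare against the leading eigenvector from an exact eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(cov)
lead = eigvecs[:, -1]
print("alignment |<v, lead>| =", abs(v @ lead))  # close to 1.0 at convergence
```

The paper's result states that the linear diffusion model converges in mean to this same leading eigenvector; the sketch only illustrates the power-iteration baseline being referenced.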