Unsupervised learning of anomalous diffusion data
- URL: http://arxiv.org/abs/2108.03411v1
- Date: Sat, 7 Aug 2021 09:45:21 GMT
- Title: Unsupervised learning of anomalous diffusion data
- Authors: Gorka Muñoz-Gil, Guillem Guigó i Corominas, Maciej Lewenstein
- Abstract summary: We show that the main diffusion characteristics can be learnt without the need to label the data.
We also explore the feasibility of finding novel types of diffusion, in this case represented by compositions of existing diffusion models.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The characterization of diffusion processes is a keystone in our understanding of a variety of physical phenomena. Many of these deviate from Brownian motion, giving rise to anomalous diffusion. Various theoretical models exist nowadays to describe such processes, but their application to experimental setups is often challenging, due to the stochastic nature of the phenomena and the difficulty of harnessing reliable data. The latter often consist of short and noisy trajectories, which are hard to characterize with usual statistical approaches. In recent years, we have witnessed an impressive effort to bridge theory and experiments by means of supervised machine learning techniques, with astonishing results. In this work, we explore the use of unsupervised methods on anomalous diffusion data. We show that the main diffusion characteristics can be learnt without any labelling of the data. We use such methods to discriminate between anomalous diffusion models and to extract their physical parameters. Moreover, we explore the feasibility of finding novel types of diffusion, in this case represented by compositions of existing diffusion models. Finally, we showcase the use of the method on experimental data and demonstrate its advantages for cases where supervised learning is not applicable.
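To make the setting concrete, the following is a minimal, hypothetical sketch of unsupervised characterization of diffusion trajectories: it generates fractional Brownian motion (fBm) paths with two different Hurst exponents, extracts simple per-trajectory features (a time-averaged MSD scaling exponent and the increment kurtosis), and clusters them without labels. It assumes only numpy and scikit-learn; the Cholesky-based fBm sampler and the feature choices are illustrative assumptions, not the authors' pipeline.

```python
# Illustrative sketch only: unsupervised clustering of simulated anomalous
# diffusion trajectories. Not the paper's method; assumes numpy + scikit-learn.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def fbm_trajectory(n_steps, hurst, rng):
    """Sample an fBm path via Cholesky decomposition of the fractional
    Gaussian noise covariance (adequate for short trajectories)."""
    t = np.arange(1, n_steps + 1)
    k = np.abs(t[:, None] - t[None, :])  # lag matrix |i - j|
    cov = 0.5 * ((k + 1) ** (2 * hurst) - 2 * k ** (2 * hurst)
                 + np.abs(k - 1) ** (2 * hurst))
    increments = np.linalg.cholesky(cov + 1e-10 * np.eye(n_steps)) @ rng.standard_normal(n_steps)
    return np.cumsum(increments)

def features(x):
    """Crude per-trajectory features: log-log TA-MSD slope (~ anomalous
    exponent) and kurtosis of the increments."""
    lags = np.arange(1, min(10, len(x) // 2))
    tamsd = [np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags]
    slope = np.polyfit(np.log(lags), np.log(tamsd), 1)[0]
    dx = np.diff(x)
    kurt = np.mean(dx ** 4) / np.mean(dx ** 2) ** 2
    return [slope, kurt]

rng = np.random.default_rng(0)
# Two unlabeled populations: subdiffusive (H = 0.3) and superdiffusive (H = 0.7)
trajs = [fbm_trajectory(200, h, rng) for h in [0.3] * 100 + [0.7] * 100]
X = StandardScaler().fit_transform([features(x) for x in trajs])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels[:100], minlength=2), np.bincount(labels[100:], minlength=2))
```

With well-separated Hurst exponents the TA-MSD slope feature alone tends to split the two populations into distinct clusters, which loosely mirrors the label-free model discrimination and parameter extraction discussed in the abstract.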
Related papers
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z) - Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instillation of task-specific information into the score function to steer the sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z) - Data Attribution for Diffusion Models: Timestep-induced Bias in Influence Estimation [53.27596811146316]
Diffusion models operate over a sequence of timesteps, rather than the instantaneous input-output relationships of earlier settings.
We present Diffusion-TracIn, which incorporates these temporal dynamics, and observe that samples' loss gradient norms are highly dependent on the timestep.
We introduce Diffusion-ReTrac as a re-normalized adaptation that enables the retrieval of training samples more targeted to the test sample of interest.
arXiv Detail & Related papers (2024-01-17T07:58:18Z) - Fair Sampling in Diffusion Models through Switching Mechanism [5.560136885815622]
We propose a fairness-aware sampling method called the attribute switching mechanism for diffusion models.
We mathematically prove and experimentally demonstrate the effectiveness of the proposed method on two key aspects.
arXiv Detail & Related papers (2024-01-06T06:55:26Z) - Guided Diffusion from Self-Supervised Diffusion Features [49.78673164423208]
Guidance serves as a key concept in diffusion models, yet its effectiveness is often limited by the need for extra data annotation or pretraining.
We propose a framework to extract guidance from, and specifically for, diffusion models.
arXiv Detail & Related papers (2023-12-14T11:19:11Z) - The Emergence of Reproducibility and Generalizability in Diffusion Models [10.188731323681575]
Given the same starting noise input and a deterministic sampler, different diffusion models often yield remarkably similar outputs.
We show that diffusion models are learning distinct distributions affected by the training data size.
This valuable property generalizes to many variants of diffusion models, including those for conditional use, solving inverse problems, and model fine-tuning.
arXiv Detail & Related papers (2023-10-08T19:02:46Z) - Soft Mixture Denoising: Beyond the Expressive Bottleneck of Diffusion Models [76.46246743508651]
We show that current diffusion models actually have an expressive bottleneck in backward denoising.
We introduce soft mixture denoising (SMD), an expressive and efficient model for backward denoising.
arXiv Detail & Related papers (2023-09-25T12:03:32Z) - Gramian Angular Fields for leveraging pretrained computer vision models with anomalous diffusion trajectories [0.9012198585960443]
We present a new data-driven method for working with diffusive trajectories.
This method utilizes Gramian Angular Fields (GAF) to encode one-dimensional trajectories as images (a minimal sketch of the GASF encoding is given after this list).
We leverage two well-established pre-trained computer-vision models, ResNet and MobileNet, to characterize the underlying diffusive regime.
arXiv Detail & Related papers (2023-09-02T17:22:45Z) - Eliminating Lipschitz Singularities in Diffusion Models [51.806899946775076]
We show that diffusion models frequently exhibit infinite Lipschitz constants near the zero point of the timesteps.
This poses a threat to the stability and accuracy of the diffusion process, which relies on integral operations.
We propose a novel approach, dubbed E-TSDM, which eliminates the Lipschitz singularities of the diffusion model near zero.
arXiv Detail & Related papers (2023-06-20T03:05:28Z) - Efficient recurrent neural network methods for anomalously diffusing single particle short and noisy trajectories [0.08594140167290096]
We present a data-driven method able to infer the anomalous exponent and to identify the type of anomalous diffusion process behind single, noisy and short trajectories.
A combination of convolutional and recurrent neural networks was used to achieve state-of-the-art results.
arXiv Detail & Related papers (2021-08-05T20:04:37Z) - Extreme Learning Machine for the Characterization of Anomalous Diffusion from Single Trajectories [0.0]
I describe a simple approach to tackle the tasks of the AnDi challenge by combining an extreme learning machine with feature engineering (AnDi-ELM).
The method reaches satisfactory performance while offering a straightforward implementation and fast training time with limited computing resources.
arXiv Detail & Related papers (2021-05-06T11:56:27Z)
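For the Gramian Angular Fields entry above, here is a minimal sketch of the GASF (Gramian Angular Summation Field) encoding step only, using numpy; the ResNet/MobileNet classification stage from that paper is not reproduced. The idea is to rescale a trajectory to [-1, 1], map it to polar angles, and build an image from pairwise angle sums.

```python
# Illustrative GASF encoding of a 1D trajectory as an image (numpy only).
import numpy as np

def gasf(x):
    """Encode a non-constant 1D series as a Gramian Angular Summation Field."""
    x = np.asarray(x, dtype=float)
    # Rescale to [-1, 1] so the arccos mapping to polar angles is well defined
    x_scaled = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x_scaled, -1, 1))
    # GASF[i, j] = cos(phi_i + phi_j)
    return np.cos(phi[:, None] + phi[None, :])

# Example: encode a short random-walk trajectory as a 100x100 image
rng = np.random.default_rng(1)
traj = np.cumsum(rng.standard_normal(100))
image = gasf(traj)
print(image.shape)  # (100, 100); such images can be fed to a pretrained CNN
```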
This list is automatically generated from the titles and abstracts of the papers in this site.