Multiscale Flow for Robust and Optimal Cosmological Analysis
- URL: http://arxiv.org/abs/2306.04689v2
- Date: Thu, 15 Feb 2024 00:55:03 GMT
- Title: Multiscale Flow for Robust and Optimal Cosmological Analysis
- Authors: Biwei Dai and Uros Seljak
- Abstract summary: Multiscale Flow is a generative Normalizing Flow that creates samples and models the field-level likelihood of two-dimensional cosmological data.
We show that Multiscale Flow is able to identify distribution shifts not present in the training data, such as baryonic effects.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose Multiscale Flow, a generative Normalizing Flow that creates
samples and models the field-level likelihood of two-dimensional cosmological
data such as weak lensing. Multiscale Flow uses hierarchical decomposition of
cosmological fields via a wavelet basis, and then models different wavelet
components separately as Normalizing Flows. The log-likelihood of the original
cosmological field can be recovered by summing over the log-likelihood of each
wavelet term. This decomposition allows us to separate the information from
different scales and identify distribution shifts in the data such as unknown
scale-dependent systematics. The resulting likelihood analysis can not only
identify these types of systematics, but can also be made optimal, in the sense
that Multiscale Flow can learn the full likelihood at the field level without any
dimensionality reduction. We apply Multiscale Flow to weak lensing mock
datasets for cosmological inference, and show that it significantly outperforms
traditional summary statistics such as power spectrum and peak counts, as well
as novel Machine Learning based summary statistics such as scattering transform
and convolutional neural networks. We further show that Multiscale Flow is able
to identify distribution shifts not present in the training data, such as baryonic
effects. Finally, we demonstrate that Multiscale Flow can be used to generate
realistic samples of weak lensing data.
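The core mechanism described above, hierarchically decomposing a 2D field with a wavelet basis and recovering the field-level log-likelihood as the sum of per-component log-likelihoods, can be sketched in plain NumPy. This is an illustrative toy, not the paper's implementation: it uses a single-level orthonormal Haar transform and a stand-in Gaussian density where the paper trains a separate Normalizing Flow per wavelet component. The field size and density model are assumptions.

```python
import numpy as np

def haar2d(field):
    """One level of orthonormal 2D Haar wavelet decomposition.
    Returns the coarse approximation and three detail components."""
    a = field[0::2, 0::2]
    b = field[0::2, 1::2]
    c = field[1::2, 0::2]
    d = field[1::2, 1::2]
    approx = (a + b + c + d) / 2.0
    horiz = (a - b + c - d) / 2.0
    vert = (a + b - c - d) / 2.0
    diag = (a - b - c + d) / 2.0
    return approx, (horiz, vert, diag)

def gaussian_loglik(x, sigma=1.0):
    """Stand-in per-component density model; the paper would use a
    trained Normalizing Flow for each wavelet component instead."""
    return -0.5 * np.sum(x**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

rng = np.random.default_rng(0)
field = rng.standard_normal((64, 64))  # mock 2D cosmological field

approx, details = haar2d(field)
# Field-level log-likelihood = sum over wavelet-component log-likelihoods
total = gaussian_loglik(approx) + sum(gaussian_loglik(d) for d in details)
```

Because the orthonormal Haar transform preserves both the coefficient count and the L2 norm, summing the per-component log-likelihoods here exactly reproduces the log-likelihood of the original field, which is the property the paper exploits, while still letting each scale be modeled (and inspected for distribution shifts) separately.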
Related papers
- Unsupervised Representation Learning from Sparse Transformation Analysis [79.94858534887801]
We propose to learn representations from sequence data by factorizing the transformations of the latent variables into sparse components.
Input data are first encoded as distributions of latent activations and subsequently transformed using a probability flow model.
arXiv Detail & Related papers (2024-10-07T23:53:25Z) - A Phase Transition in Diffusion Models Reveals the Hierarchical Nature of Data [55.748186000425996]
Recent advancements show that diffusion models can generate high-quality images.
We study this phenomenon in a hierarchical generative model of data.
Our analysis characterises the relationship between time and scale in diffusion models.
arXiv Detail & Related papers (2024-02-26T19:52:33Z) - Unpaired Downscaling of Fluid Flows with Diffusion Bridges [0.0]
We show how one can chain together two independent conditional diffusion models for use in domain translation.
The resulting transformation is a diffusion bridge between a low resolution and a high resolution dataset.
We demonstrate that the method enhances resolution and corrects context-dependent biases in geophysical fluid simulations.
arXiv Detail & Related papers (2023-05-02T23:13:44Z) - ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z) - Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z) - Translation and Rotation Equivariant Normalizing Flow (TRENF) for Optimal Cosmological Analysis [7.6146285961466]
Our universe is homogeneous and isotropic, and its perturbations obey translation and rotation symmetry.
We develop a generative Normalizing Flow model which explicitly incorporates these symmetries.
TRENF gives direct access to the high dimensional data likelihood p(x|y) as a function of the labels y.
arXiv Detail & Related papers (2022-02-10T19:00:03Z) - A graph representation based on fluid diffusion model for multimodal data analysis: theoretical aspects and enhanced community detection [14.601444144225875]
We introduce a novel model for graph definition based on fluid diffusion.
Our method is able to strongly outperform state-of-the-art schemes for community detection in multimodal data analysis.
arXiv Detail & Related papers (2021-12-07T16:30:03Z) - Discrete Denoising Flows [87.44537620217673]
We introduce a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs).
In contrast with other discrete flow-based models, our model can be locally trained without introducing gradient bias.
We show that DDFs outperform Discrete Flows on modeling a toy example, binary MNIST and Cityscapes segmentation maps, measured in log-likelihood.
arXiv Detail & Related papers (2021-07-24T14:47:22Z) - Flows for simultaneous manifold learning and density estimation [12.451050883955071]
Manifold-learning flows (M-flows) represent datasets with a manifold structure more faithfully.
M-flows learn the data manifold and allow for better inference than standard flows in the ambient data space.
arXiv Detail & Related papers (2020-03-31T02:07:48Z) - Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow [16.41460104376002]
We introduce subset flows, a class of flows that can transform finite volumes and allow exact computation of likelihoods for discrete data.
We identify ordinal discrete autoregressive models, including WaveNets, PixelCNNs and Transformers, as single-layer flows.
We demonstrate state-of-the-art results on CIFAR-10 for flow models trained with dequantization.
arXiv Detail & Related papers (2020-02-06T22:58:51Z) - Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
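Several of the flow-based papers above (TRENF, ManiFlow, FlowGMM, and Multiscale Flow itself) rest on the same change-of-variables identity for a Normalizing Flow: log p(x) = log p_z(f(x)) + log|det df/dx|. A minimal NumPy sketch with a single affine transform and a standard normal base density illustrates this; the scale and shift values are purely illustrative assumptions.

```python
import numpy as np

def affine_flow_loglik(x, scale, shift):
    """log p(x) for the flow z = (x - shift) / scale with a standard
    normal base density. Change of variables:
    log p(x) = log N(z; 0, 1) + log|dz/dx|, and dz/dx = 1/scale."""
    z = (x - shift) / scale
    base = -0.5 * (z**2 + np.log(2 * np.pi))  # elementwise log N(z; 0, 1)
    log_det = -np.log(np.abs(scale))          # log|dz/dx| per element
    return np.sum(base + log_det)

# Example: with an affine flow, p(x) is a Gaussian N(shift, scale^2),
# so the flow log-likelihood matches the closed-form Gaussian density.
x = np.array([0.5, -1.0, 2.0])
ll = affine_flow_loglik(x, scale=2.0, shift=0.3)
```

Deeper flows chain many such transforms, accumulating the log-determinant terms; the tractability of this sum is what gives TRENF and Multiscale Flow direct access to the field-level likelihood without dimensionality reduction.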
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.