Toward using GANs in astrophysical Monte-Carlo simulations
- URL: http://arxiv.org/abs/2402.12396v1
- Date: Fri, 16 Feb 2024 23:07:53 GMT
- Title: Toward using GANs in astrophysical Monte-Carlo simulations
- Authors: Ahab Isaac, Wesley Armour, Karel Adámek
- Abstract summary: We show that a generative adversarial network (GAN) is capable of statistically replicating the Maxwell-Jüttner distribution.
The average value of the Kolmogorov-Smirnov test is 0.5 for samples generated by the neural network, showing that the generated distribution cannot be distinguished from the true distribution.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate modelling of spectra produced by X-ray sources requires the use of
Monte-Carlo simulations. These simulations need to evaluate physical processes,
such as those occurring in accretion processes around compact objects by
sampling a number of different probability distributions. This is
computationally time-consuming and could be sped up if replaced by neural
networks. We demonstrate, on the example of the Maxwell-Jüttner distribution,
which describes the speed of relativistic electrons, that a generative
adversarial network (GAN) is capable of statistically replicating the
distribution. The average value of the Kolmogorov-Smirnov test is 0.5 for
samples generated by the neural network, showing that the generated
distribution cannot be distinguished from the true distribution.
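A useful sanity check on that 0.5 figure: when two sample sets are drawn from the same distribution, the two-sample Kolmogorov-Smirnov p-value is uniform on [0, 1], so it averages 0.5. The sketch below (not the authors' code; the temperature θ = 1 and the tabulated inverse-CDF sampler are illustrative choices) draws Maxwell-Jüttner samples numerically and reproduces that behaviour:

```python
import numpy as np
from scipy.stats import ks_2samp

THETA = 1.0  # dimensionless temperature kT/(m c^2); illustrative value

def mj_pdf(gamma):
    # Unnormalised Maxwell-Juttner density in the Lorentz factor gamma >= 1
    return gamma * np.sqrt(gamma**2 - 1.0) * np.exp(-gamma / THETA)

# Tabulate the CDF once, then sample by inverse-transform interpolation
grid = np.linspace(1.0, 30.0, 20_000)
cdf = np.cumsum(mj_pdf(grid))
cdf /= cdf[-1]

def sample_mj(n, rng):
    return np.interp(rng.random(n), cdf, grid)

rng = np.random.default_rng(0)
pvals = [ks_2samp(sample_mj(2000, rng), sample_mj(2000, rng)).pvalue
         for _ in range(200)]
print(round(float(np.mean(pvals)), 2))  # hovers around 0.5 under the null
```

A trained GAN generator would simply replace `sample_mj` on one side of the test; an average p-value near 0.5 is then the evidence that the generated and true distributions are statistically indistinguishable.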
Related papers
- Generative Conditional Distributions by Neural (Entropic) Optimal Transport [12.152228552335798]
We introduce a novel neural entropic optimal transport method designed to learn generative models of conditional distributions.
Our method relies on the minimax training of two neural networks.
Our experiments on real-world datasets show the effectiveness of our algorithm compared to state-of-the-art conditional distribution learning techniques.
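In the discrete case, the entropic optimal transport problem that this method neuralises can be solved directly with Sinkhorn iterations; a minimal numpy sketch (a classical baseline under an illustrative squared-distance cost, not the paper's minimax method):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropic OT plan between discrete measures a, b with cost matrix C."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)   # rescale to match the column marginal
        u = a / (K @ v)     # rescale to match the row marginal
    return u[:, None] * K * v[None, :]

n = 5
a = np.full(n, 1.0 / n)               # uniform source weights
b = np.full(n, 1.0 / n)               # uniform target weights
x = np.linspace(0.0, 1.0, n)
C = (x[:, None] - x[None, :]) ** 2    # squared-distance cost (illustrative)
P = sinkhorn(a, b, C)
print(round(float(P.sum()), 6))       # a valid coupling carries total mass 1
```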
arXiv Detail & Related papers (2024-06-04T13:45:35Z)
- An Efficient Quasi-Random Sampling for Copulas [3.400056739248712]
This paper proposes the use of generative models, such as Generative Adversarial Networks (GANs), to generate quasi-random samples for any copula.
GANs are a type of implicit generative models used to learn the distribution of complex data, thus facilitating easy sampling.
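For a fixed parametric family such as the Gaussian copula, quasi-random sampling already has a closed-form recipe that the GAN approach generalises to arbitrary copulas; a sketch assuming a 2-D Gaussian copula with correlation 0.7 (an arbitrary choice) and SciPy's Sobol generator:

```python
import numpy as np
from scipy.stats import norm, qmc

# Target dependence: 2-D Gaussian copula with correlation 0.7 (illustrative)
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
L = np.linalg.cholesky(corr)

u = qmc.Sobol(d=2, scramble=True, seed=0).random(1024)  # low-discrepancy uniforms
z = norm.ppf(u) @ L.T        # correlate the Gaussian scores
samples = norm.cdf(z)        # back to [0, 1]^2 with the copula's dependence
print(np.round(np.corrcoef(samples.T)[0, 1], 2))
```

The Pearson correlation of the uniform marginals lands a little below 0.7, as expected for a Gaussian copula.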
arXiv Detail & Related papers (2024-03-08T13:01:09Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Expressive probabilistic sampling in recurrent neural networks [4.3900330990701235]
We show that firing rate dynamics of a recurrent neural circuit with a separate set of output units can sample from an arbitrary probability distribution.
We propose an efficient training procedure based on denoising score matching that finds recurrent and output weights such that the RSN implements Langevin sampling.
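The Langevin update such a network is trained to implement is x ← x + ε ∇log p(x) + √(2ε) ξ with ξ ~ N(0, 1). A standalone numpy sketch on a standard-normal target, whose score is simply -x (illustrative only, not the paper's recurrent circuit):

```python
import numpy as np

def score(x):
    # Score of the standard-normal target: d/dx log p(x) = -x
    return -x

def langevin(n_steps=5000, step=0.01, n_chains=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = 3.0 * rng.normal(size=n_chains)  # deliberately broad initialisation
    for _ in range(n_steps):
        # Unadjusted Langevin update: drift along the score plus Gaussian noise
        x = x + step * score(x) + np.sqrt(2.0 * step) * rng.normal(size=n_chains)
    return x

samples = langevin()
print(round(float(samples.mean()), 2), round(float(samples.std()), 2))
```

Despite the broad initialisation, the chains relax to the target's mean 0 and standard deviation 1 (up to the small discretisation bias of the step size).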
arXiv Detail & Related papers (2023-08-22T22:20:39Z)
- Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z)
- Unsupervised Learning of Sampling Distributions for Particle Filters [80.6716888175925]
We put forward four methods for learning sampling distributions from observed measurements.
Experiments demonstrate that learned sampling distributions exhibit better performance than designed, minimum-degeneracy sampling distributions.
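For context on what is being learned, a hand-designed bootstrap particle filter for a 1-D linear-Gaussian random walk (the noise levels below are arbitrary illustrative choices) can be sketched as follows:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 1000            # time steps, particles
q, r = 0.1, 0.5            # process / observation noise std (illustrative)

# Simulate a latent random walk and noisy observations of it
x_true = np.cumsum(rng.normal(0, q, T))
y = x_true + rng.normal(0, r, T)

# Bootstrap particle filter: propagate, weight by likelihood, resample
particles = rng.normal(0, 1, N)
estimates = []
for t in range(T):
    particles = particles + rng.normal(0, q, N)          # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)      # likelihood weights
    w /= w.sum()
    estimates.append(np.sum(w * particles))               # posterior mean
    idx = rng.choice(N, size=N, p=w)                      # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(round(float(rmse), 3))  # well below the raw observation noise r
```

Here the transition model doubles as the sampling distribution; the learned sampling distributions above aim to beat this designed choice.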
arXiv Detail & Related papers (2023-02-02T15:50:21Z)
- An Energy-Based Prior for Generative Saliency [62.79775297611203]
We propose a novel generative saliency prediction framework that adopts an informative energy-based model as a prior distribution.
With the generative saliency model, we can obtain a pixel-wise uncertainty map from an image, indicating model confidence in the saliency prediction.
Experimental results show that our generative saliency model with an energy-based prior can achieve not only accurate saliency predictions but also reliable uncertainty maps consistent with human perception.
arXiv Detail & Related papers (2022-04-19T10:51:00Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
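The same model-then-sample pattern can be illustrated with a kernel density estimate standing in for the PSD model (a hedged sketch, not the paper's method; the data-generating parameters are arbitrary):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
data = rng.normal(2.0, 0.5, size=5000)   # "observed" data (illustrative)

model = gaussian_kde(data)               # step 1: model the density
new = model.resample(5000, seed=1)[0]    # step 2: sample from the model
print(round(float(new.mean()), 2), round(float(new.std()), 2))
```

The resampled set recovers the original mean and a slightly inflated spread, the extra width coming from the kernel bandwidth.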
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Unrolling Particles: Unsupervised Learning of Sampling Distributions [102.72972137287728]
Particle filtering is used to compute good nonlinear estimates of complex systems.
We show in simulations that the resulting particle filter yields good estimates in a wide range of scenarios.
arXiv Detail & Related papers (2021-10-06T16:58:34Z)
- Generative Neural Samplers for the Quantum Heisenberg Chain [0.3655021726150368]
Generative neural samplers offer a complementary approach to Monte Carlo methods for problems in statistical physics and quantum field theory.
This work tests the ability of generative neural samplers to estimate observables for real-world low-dimensional spin systems.
arXiv Detail & Related papers (2020-12-18T14:28:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.