An Efficient Quasi-Random Sampling for Copulas
- URL: http://arxiv.org/abs/2403.05281v1
- Date: Fri, 8 Mar 2024 13:01:09 GMT
- Title: An Efficient Quasi-Random Sampling for Copulas
- Authors: Sumin Wang, Chenxian Huang, Yongdao Zhou and Min-Qian Liu
- Abstract summary: This paper proposes the use of generative models, such as Generative Adversarial Networks (GANs), to generate quasi-random samples for any copula.
GANs are a type of implicit generative model used to learn the distribution of complex data, thus facilitating easy sampling.
- Score: 3.400056739248712
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper examines an efficient method for quasi-random sampling of copulas
in Monte Carlo computations. Traditional methods, like conditional distribution
methods (CDM), have limitations when dealing with high-dimensional or implicit
copulas, which refer to those that cannot be accurately represented by existing
parametric copulas. Instead, this paper proposes the use of generative models,
such as Generative Adversarial Networks (GANs), to generate quasi-random
samples for any copula. GANs are a type of implicit generative model used to
learn the distribution of complex data, thus facilitating easy sampling. In our
study, GANs are employed to learn the mapping from a uniform distribution to
copulas. Once this mapping is learned, obtaining quasi-random samples from the
copula only requires inputting quasi-random samples from the uniform
distribution. This approach offers a more flexible method for any copula.
Additionally, we provide theoretical analysis of quasi-Monte Carlo estimators
based on quasi-random samples of copulas. Through simulated and practical
applications, particularly in the field of risk management, we validate the
proposed method and demonstrate its superiority over various existing methods.
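The pipeline described in the abstract can be sketched as follows: draw low-discrepancy (Sobol') points on the unit hypercube, then push them through a learned generator to obtain quasi-random copula samples. A trained GAN generator cannot be reproduced here, so a closed-form Gaussian-copula map (with an arbitrary correlation `rho`) stands in for it; everything except the Sobol'-input-then-generator structure is an illustrative assumption, not the paper's implementation.

```python
import numpy as np
from scipy.stats import norm, qmc

# Step 1: quasi-random (Sobol') points on the unit hypercube -- the
# low-discrepancy input that would be fed to the trained generator.
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
u = sobol.random(256)                      # shape (256, 2)
u = np.clip(u, 1e-12, 1 - 1e-12)           # keep the normal ppf finite

# Step 2: push the points through a generator. Here a closed-form
# Gaussian-copula map plays the role of the learned GAN mapping.
rho = 0.7
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

def generator(u):
    z = norm.ppf(u)        # uniforms -> independent standard normals
    x = z @ L.T            # impose the target correlation structure
    return norm.cdf(x)     # back to [0, 1]^2: copula-distributed points

v = generator(u)
print(v.shape)             # (256, 2)
print(np.corrcoef(v.T)[0, 1] > 0)  # positive dependence preserved
```

Because the map is deterministic, the low-discrepancy structure of the Sobol' input carries over to the copula samples, which is what makes the resulting quasi-Monte Carlo estimators converge faster than plain Monte Carlo.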
Related papers
- Conditional sampling within generative diffusion models [12.608803080528142]
We present a review of existing computational approaches to conditional sampling within generative diffusion models.
We highlight key methodologies that either utilise the joint distribution, or rely on (pre-trained) marginal distributions with explicit likelihoods.
arXiv Detail & Related papers (2024-09-15T07:48:40Z)
- Gaussian Processes Sampling with Sparse Grids under Additive Schwarz Preconditioner [6.408773096179187]
We propose a scalable algorithm for sampling random realizations of the prior and posterior of GP models.
The proposed algorithm leverages inducing points approximation with sparse grids, as well as additive Schwarz preconditioners.
arXiv Detail & Related papers (2024-08-01T00:19:36Z)
- Statistical Mechanics Calculations Using Variational Autoregressive Networks and Quantum Annealing [0.552480439325792]
An approximation method using a variational autoregressive network (VAN) has been proposed recently.
The present study introduces a novel approximation method that employs samples derived from quantum annealing machines in conjunction with VAN.
When applied to the finite-size Sherrington-Kirkpatrick model, the proposed method demonstrates enhanced accuracy compared to the traditional VAN approach.
arXiv Detail & Related papers (2024-04-30T05:41:49Z)
- Simple and effective data augmentation for compositional generalization [64.00420578048855]
We show that data augmentation methods that sample MRs and backtranslate them can be effective for compositional generalization.
Remarkably, sampling from a uniform distribution performs almost as well as sampling from the test distribution.
arXiv Detail & Related papers (2024-01-18T09:13:59Z)
- Structured Voronoi Sampling [61.629198273926676]
In this paper, we take an important step toward building a principled approach for sampling from language models with gradient-based methods.
We name our gradient-based technique Structured Voronoi Sampling (SVS).
In a controlled generation task, SVS is able to generate fluent and diverse samples while following the control targets significantly better than other methods.
arXiv Detail & Related papers (2023-06-05T17:32:35Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- AdaPT-GMM: Powerful and robust covariate-assisted multiple testing [0.7614628596146599]
We propose a new empirical Bayes method for covariate-assisted multiple testing with false discovery rate (FDR) control.
Our method refines the adaptive p-value thresholding (AdaPT) procedure by generalizing its masking scheme.
We show in extensive simulations and real data examples that our new method, which we call AdaPT-GMM, consistently delivers high power.
arXiv Detail & Related papers (2021-06-30T05:06:18Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
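The pathwise interpretation rests on Matheron's rule: condition a single joint prior draw instead of sampling from the marginal posterior distribution. The sketch below shows the rule in naive dense form; the RBF kernel, jitter values, and data are illustrative assumptions, not the paper's efficient decoupled approximation.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel on 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

rng = np.random.default_rng(0)
X = np.array([-1.0, 0.0, 1.5])       # observed inputs
y = np.array([0.3, -0.2, 0.8])       # noiseless observations
Xs = np.linspace(-3.0, 3.0, 20)      # test inputs

# One joint prior draw over test and observed locations.
Xall = np.concatenate([Xs, X])
Kall = rbf(Xall, Xall) + 1e-6 * np.eye(Xall.size)
f_all = np.linalg.cholesky(Kall) @ rng.standard_normal(Xall.size)
f_s, f_X = f_all[:Xs.size], f_all[Xs.size:]

# Matheron's rule: posterior sample = prior sample + a deterministic
# correction that pulls the path through the observations.
K_XX = rbf(X, X) + 1e-6 * np.eye(X.size)
post = f_s + rbf(Xs, X) @ np.linalg.solve(K_XX, y - f_X)
print(post.shape)                    # (20,)
```

The correction term reuses the same prior draw at the observed inputs, so at those inputs the conditioned path reproduces `y` (up to the jitter), without ever forming the cubic-cost posterior covariance over the test set.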
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learn a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity.
arXiv Detail & Related papers (2020-09-24T19:34:37Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.