Variational inference with a quantum computer
- URL: http://arxiv.org/abs/2103.06720v1
- Date: Thu, 11 Mar 2021 15:12:21 GMT
- Title: Variational inference with a quantum computer
- Authors: Marcello Benedetti, Brian Coyle, Mattia Fiorentini, Michael Lubasch,
Matthias Rosenkranz
- Abstract summary: Inference is the task of drawing conclusions about unobserved variables given observations of related variables.
One alternative is variational inference, where a candidate probability distribution is optimized to approximate the posterior distribution over unobserved variables.
In this work, we propose quantum Born machines as variational distributions over discrete variables.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inference is the task of drawing conclusions about unobserved variables given
observations of related variables. Applications range from identifying diseases
from symptoms to classifying economic regimes from price movements.
Unfortunately, performing exact inference is intractable in general. One
alternative is variational inference, where a candidate probability
distribution is optimized to approximate the posterior distribution over
unobserved variables. For good approximations a flexible and highly expressive
candidate distribution is desirable. In this work, we propose quantum Born
machines as variational distributions over discrete variables. We apply the
framework of operator variational inference to achieve this goal. In
particular, we adopt two specific realizations: one with an adversarial
objective and one based on the kernelized Stein discrepancy. We demonstrate the
approach numerically using examples of Bayesian networks, and implement an
experiment on an IBM quantum computer. Our techniques enable efficient
variational inference with distributions beyond those that are efficiently
representable on a classical computer.
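As a concrete, heavily simplified illustration of the idea, the sketch below classically simulates a toy Born machine over three binary variables. A product of single-qubit R_y rotations stands in for a circuit ansatz, and a kernelized maximum mean discrepancy stands in for the paper's adversarial and kernelized Stein objectives; the target distribution, kernel, and optimizer are all assumptions made for the example.

```python
# Toy classical simulation of a Born machine as a variational distribution
# over bitstrings. Illustrative only: a product of single-qubit R_y rotations
# stands in for a circuit ansatz, and a kernelized MMD stands in for the
# paper's adversarial / kernelized Stein objectives.
import numpy as np

n = 3  # number of qubits / binary variables

def born_probs(thetas):
    # For R_y(theta)|0>, P(bit = 1) = sin^2(theta / 2); product state overall.
    p1 = np.sin(thetas / 2) ** 2
    return np.array([
        np.prod([p1[i] if (x >> i) & 1 else 1 - p1[i] for i in range(n)])
        for x in range(2 ** n)
    ])

# Kernel on bitstrings: k(x, y) = exp(-Hamming(x, y)).
K = np.array([[np.exp(-bin(x ^ y).count("1")) for y in range(2 ** n)]
              for x in range(2 ** n)])

def mmd2(p, q):
    d = p - q
    return d @ K @ d  # squared maximum mean discrepancy

# Stand-in "posterior" over the 2^n outcomes (arbitrary example).
target = np.random.default_rng(0).dirichlet(np.ones(2 ** n))

thetas, lr, eps = np.full(n, np.pi / 2), 0.5, 1e-5
for _ in range(300):
    base = mmd2(born_probs(thetas), target)
    grad = np.zeros(n)
    for i in range(n):  # finite-difference gradient, for brevity
        shifted = thetas.copy()
        shifted[i] += eps
        grad[i] = (mmd2(born_probs(shifted), target) - base) / eps
    thetas -= lr * grad

print("final squared MMD:", mmd2(born_probs(thetas), target))
```

A product ansatz can only match the marginals of a correlated posterior; the entangling circuits used in the paper are what make the variational family more expressive than classically tractable ones.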
Related papers
- SoftCVI: Contrastive variational inference with self-generated soft labels [2.5398014196797614]
Variational inference and Markov chain Monte Carlo methods are the predominant tools for approximating such posterior distributions.
We introduce Soft Contrastive Variational Inference (SoftCVI), which allows a family of variational objectives to be derived through a contrastive estimation framework.
We find that SoftCVI can be used to form objectives which are stable to train and mass-covering, frequently outperforming inference with other variational approaches.
arXiv Detail & Related papers (2024-07-22T14:54:12Z)
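One plausible instance of the contrastive construction, reduced to a few lines (an assumed form, not SoftCVI's exact loss): samples drawn from the variational distribution are scored against self-generated soft labels derived from the unnormalized posterior.

```python
# Assumed soft contrastive variational objective: K samples z_k ~ q are
# labeled with self-normalized weights from the unnormalized posterior,
# and q is trained by cross-entropy against those soft labels.
import torch

def soft_contrastive_loss(log_q, log_p_tilde):
    # log_q: log q(z_k), with gradients; log_p_tilde: log unnormalized posterior
    soft_labels = torch.softmax(log_p_tilde - log_q.detach(), dim=0)
    log_scores = torch.log_softmax(log_q, dim=0)
    return -(soft_labels * log_scores).sum()
```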
- DistPred: A Distribution-Free Probabilistic Inference Method for Regression and Forecasting [14.390842560217743]
We propose a novel approach called DistPred for regression and forecasting tasks.
We transform proper scoring rules that measure the discrepancy between the predicted distribution and the target distribution into a differentiable discrete form.
This allows the model to draw many samples in a single forward pass to estimate the predictive distribution of the response variable.
arXiv Detail & Related papers (2024-06-17T10:33:00Z)
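The differentiable scoring-rule idea can be sketched with the energy score, a proper scoring rule computed directly on samples (the specific rule and model head are assumptions here, not necessarily DistPred's choices):

```python
# Sample-based proper scoring rule as a training loss (assumed details):
# the model emits K samples per input in one forward pass, and the energy
# score E|X - y| - 0.5 E|X - X'| is minimized directly.
import torch

def energy_score(samples, y):
    # samples: (K,) draws for one response; y: scalar target
    fit = (samples - y).abs().mean()
    spread = (samples[:, None] - samples[None, :]).abs().mean()
    return fit - 0.5 * spread  # differentiable w.r.t. the samples
```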
- Proxy Methods for Domain Adaptation [78.03254010884783]
Proxy variables allow for adaptation to distribution shift without explicitly recovering or modeling latent variables.
We develop a two-stage kernel estimation approach to adapt to complex distribution shifts in both settings.
arXiv Detail & Related papers (2024-03-12T09:32:41Z)
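Two-stage kernel estimation builds on kernel ridge regression; a generic two-stage skeleton is sketched below. The proxy-variable identification argument itself is not reproduced, and all variable roles are assumptions.

```python
# Generic two-stage kernel ridge regression skeleton (assumed setup).
# X: (N, d) inputs, W: (N, d_w) proxy/bridge variables, Y: (N,) outcomes.
import numpy as np

def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge(X, Y, lam=1e-2):
    alpha = np.linalg.solve(rbf(X, X) + lam * np.eye(len(X)), Y)
    return lambda Xq: rbf(Xq, X) @ alpha

def two_stage(X, W, Y):
    w_hat = kernel_ridge(X, W)(X)  # stage 1: fit the bridge quantity
    return kernel_ridge(w_hat, Y)  # stage 2: regress outcome on stage-1 fits
```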
- Equivariance Discovery by Learned Parameter-Sharing [153.41877129746223]
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes.
We also theoretically analyze the method for Gaussian data and bound the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z)
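A toy rendering of "optimizing over parameter-sharing schemes" (the soft-assignment parameterization below is an assumption, not the paper's): each weight entry softly selects a value from a small shared pool, and the selection logits are learned jointly with the task loss.

```python
# Toy learned parameter-sharing layer (assumed parameterization).
import torch
import torch.nn as nn

class SharedLinear(nn.Module):
    def __init__(self, d_in, d_out, n_shared=4):
        super().__init__()
        self.pool = nn.Parameter(torch.randn(n_shared))               # shared values
        self.logits = nn.Parameter(torch.zeros(d_out, d_in, n_shared))

    def forward(self, x):
        assign = torch.softmax(self.logits, dim=-1)  # each entry picks from the pool
        W = assign @ self.pool                       # (d_out, d_in) tied weights
        return x @ W.T

layer = SharedLinear(8, 4)
out = layer(torch.randn(2, 8))  # trained end-to-end with any task loss
```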
- Generative Quantum Learning of Joint Probability Distribution Functions [1.221966660783828]
We design quantum machine learning algorithms to model copulas.
We show that any copula can be naturally mapped to a multipartite maximally entangled state.
A variational ansatz we christen as a 'qopula' creates arbitrary correlations between variables.
arXiv Detail & Related papers (2021-09-13T20:50:15Z)
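The copula-to-entanglement mapping described in the entry can be made concrete with one line of algebra (notation mine, not the paper's): measuring both registers of a maximally entangled state in the computational basis gives

```latex
|\Phi\rangle = \frac{1}{\sqrt{d}} \sum_{i=0}^{d-1} |i\rangle|i\rangle,
\qquad
\Pr(i, j) = \left|\langle i, j | \Phi \rangle\right|^2 = \frac{1}{d}\,\delta_{ij}.
```

Each marginal is uniform while the joint is perfectly correlated, the discrete analogue of the comonotonic copula; a variational circuit applied to the registers (the 'qopula' ansatz) then reshapes these correlations.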
- Predicting with Confidence on Unseen Distributions [90.68414180153897]
We connect domain adaptation and predictive uncertainty literature to predict model accuracy on challenging unseen distributions.
We find that the difference of confidences (DoC) of a classifier's predictions successfully estimates the classifier's performance change over a variety of shifts.
We specifically investigate the distinction between synthetic and natural distribution shifts and observe that despite its simplicity DoC consistently outperforms other quantifications of distributional difference.
arXiv Detail & Related papers (2021-07-07T15:50:18Z)
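The DoC signal above is simple enough to sketch directly (variable names assumed; the paper also fits a regression from DoC to the accuracy drop rather than subtracting it outright):

```python
# Difference of confidences (DoC): average max-softmax confidence on
# held-out source data minus the same quantity on shifted target data,
# used to predict the accuracy change under the shift.
import numpy as np

def avg_confidence(probs):
    # probs: (N, C) softmax outputs
    return probs.max(axis=1).mean()

def predict_target_accuracy(source_probs, target_probs, source_accuracy):
    doc = avg_confidence(source_probs) - avg_confidence(target_probs)
    return source_accuracy - doc  # simplest variant of the estimator
```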
- Modeling Sequences as Distributions with Uncertainty for Sequential Recommendation [63.77513071533095]
Most existing sequential methods assume users are deterministic.
In practice, item-item transitions can fluctuate significantly across item aspects and exhibit randomness in user interests.
We propose a Distribution-based Transformer for Sequential Recommendation (DT4SR), which injects uncertainty into sequential modeling.
arXiv Detail & Related papers (2021-06-11T04:35:21Z)
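A sketch of the distributional-embedding ingredient (the Gaussian form and the distance below are assumptions based on the summary): items become diagonal Gaussians, compared with the closed-form 2-Wasserstein distance.

```python
# Distributional item embeddings (assumed form): diagonal Gaussians compared
# with the 2-Wasserstein distance, which has a closed form here and is
# differentiable, so it can serve directly as a ranking score.
import torch

def w2(mu1, sig1, mu2, sig2):
    # W_2 between N(mu1, diag(sig1^2)) and N(mu2, diag(sig2^2))
    return (((mu1 - mu2) ** 2).sum(-1) + ((sig1 - sig2) ** 2).sum(-1)).sqrt()
```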
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
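The "natural gradient" ingredient of NGBoost, shown here for a univariate normal for brevity (the paper's contribution is the multivariate case): the NLL gradient is preconditioned by the inverse Fisher information, making boosting steps invariant to the parameterization.

```python
# Natural gradient of the Gaussian NLL in the (mu, log sigma) parameterization.
import numpy as np

def natural_gradient(y, mu, log_sigma):
    sigma2 = np.exp(2 * log_sigma)
    grad = np.array([(mu - y) / sigma2,              # d NLL / d mu
                     1.0 - (y - mu) ** 2 / sigma2])  # d NLL / d log sigma
    fisher = np.diag([1.0 / sigma2, 2.0])            # Fisher information
    return np.linalg.solve(fisher, grad)             # F^{-1} grad
```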
- An Embedded Model Estimator for Non-Stationary Random Functions using Multiple Secondary Variables [0.0]
This paper introduces the method and shows that it enjoys consistency results similar in nature to those for geostatistical modelling and for Quantile Random Forests.
The algorithm works by estimating a conditional distribution for the target variable at each target location.
arXiv Detail & Related papers (2020-11-09T00:14:24Z)
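For context on the Quantile Random Forest comparison, here is the classic leaf-reweighting recipe for a conditional distribution at a query point (a simplified co-occurrence variant of QRF weighting, not the paper's embedded-model algorithm):

```python
# Conditional quantiles from a fitted random forest, QRF-style (simplified:
# weights are leaf co-occurrence fractions rather than per-leaf-size weights).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def conditional_quantiles(forest, X_train, y_train, x_query, qs=(0.1, 0.5, 0.9)):
    # forest: a fitted RandomForestRegressor
    train_leaves = forest.apply(X_train)           # (N, n_trees) leaf ids
    query_leaves = forest.apply(x_query[None, :])  # (1, n_trees)
    weights = (train_leaves == query_leaves).mean(axis=1)
    order = np.argsort(y_train)
    cdf = np.cumsum(weights[order]) / weights.sum()
    return [y_train[order][np.searchsorted(cdf, q)] for q in qs]
```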
- Variational Inference as Iterative Projection in a Bayesian Hilbert Space with Application to Robotic State Estimation [14.670851095242451]
Variational Bayesian inference is an important machine-learning tool that finds application from statistics to robotics.
We show that variational inference based on KL divergence can amount to iterative projection, in the Euclidean sense, of the Bayesian posterior onto a subspace corresponding to the selected approximation family.
arXiv Detail & Related papers (2020-05-14T21:33:31Z)
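The projection picture in one formula (notation mine): variational inference solves

```latex
q^{\star} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}}\;
\mathrm{KL}\!\left( q(\mathbf{z}) \,\middle\|\, p(\mathbf{z} \mid \mathbf{x}) \right),
```

and in a Bayesian Hilbert space where (log-)densities are treated as vectors, restricting the family Q to a subspace turns this minimization into an iterated Euclidean projection of the posterior onto that subspace.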
- Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z)
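One standard way to realize "a posterior approximation distinct from the variational distribution" is to use q only as an importance-sampling proposal when evaluating decisions (the setup below is an assumption in that spirit, not the paper's exact procedure):

```python
# Expected loss under a self-normalized importance-sampling posterior
# approximation, with the variational distribution q as proposal.
import numpy as np

def posterior_expectation(loss_fn, z_samples, log_p_tilde, log_q):
    # z_samples ~ q; log_p_tilde: unnormalized posterior log-density at samples;
    # loss_fn(z_samples) returns one loss value per sample, shape (K,)
    log_w = log_p_tilde - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                          # self-normalized importance weights
    return (w * loss_fn(z_samples)).sum()
```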
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.