A supplemental investigation of non-linearity in quantum generative models with respect to simulatability and optimization
- URL: http://arxiv.org/abs/2302.00788v2
- Date: Mon, 29 Apr 2024 15:55:19 GMT
- Title: A supplemental investigation of non-linearity in quantum generative models with respect to simulatability and optimization
- Authors: Kaitlin Gili, Rohan S. Kumar, Mykolas Sveistrys, C. J. Ballance
- Abstract summary: We investigate two questions of relevance to the quantum algorithms and machine learning communities.
Does introducing this form of non-linearity make the learning model classically simulatable due to the deferred measurement principle?
And does introducing this form of non-linearity make the overall model's training more unstable?
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent work has demonstrated the utility of introducing non-linearity through repeat-until-success (RUS) sub-routines into quantum circuits for generative modeling. As a follow-up to this work, we investigate two questions of relevance to the quantum algorithms and machine learning communities: Does introducing this form of non-linearity make the learning model classically simulatable due to the deferred measurement principle? And does introducing this form of non-linearity make the overall model's training more unstable? With respect to the first question, we demonstrate that the RUS sub-routines do not allow us to trivially map this quantum model to a classical one, whereas a model without RUS sub-circuits containing mid-circuit measurements could be mapped to a classical Bayesian network due to the deferred measurement principle of quantum mechanics. This strongly suggests that the proposed form of non-linearity makes the model classically inefficient to simulate. In the pursuit of the second question, we train larger models than previously shown on three different probability distributions, one continuous and two discrete, and compare the training performance across multiple random trials. We see that while the model is able to perform exceptionally well in some trials, the variance across trials with certain datasets quantifies its relatively poor training stability.
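The RUS mechanism behind both questions can be illustrated with a small statevector simulation. The sketch below is a simplified, numpy-only rendering of the standard repeat-until-success quantum-neuron construction (in the style of Cao, Guerreschi, and Aspuru-Guzik), not the paper's exact circuits: an ancilla is rotated, entangled with the output qubit, un-rotated, and measured; success leaves the output rotated by the non-linear activation q = arctan(tan^2(theta)), while failure applies a known pi/2 kick that is undone before repeating. The function name and two-qubit layout are illustrative assumptions.

```python
import numpy as np

def ry(t):
    """Single-qubit Y rotation R_y(t) = exp(-i t Y / 2) (real matrix)."""
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]])

def rus_quantum_neuron(theta, rng, max_attempts=100):
    """One repeat-until-success (RUS) activation step on an output qubit.

    Two-qubit statevector simulation (ancilla tensor output, ancilla first).
    On success (ancilla measured 0) the output qubit, starting in |0>, ends
    rotated by the non-linear angle q = arctan(tan(theta)**2); on failure
    the output receives a known pi/2 rotation, which is undone before the
    circuit is re-applied -- this conditional recovery is exactly what the
    deferred measurement principle cannot remove for free.
    """
    I2 = np.eye(2)
    iY = np.array([[0.0, 1.0], [-1.0, 0.0]])           # iY on the output qubit
    c_iY = np.block([[I2, np.zeros((2, 2))],
                     [np.zeros((2, 2)), iY]])          # controlled on ancilla=1
    out = np.array([1.0, 0.0])                         # output starts in |0>
    for _ in range(max_attempts):
        state = np.kron(np.array([1.0, 0.0]), out)     # fresh ancilla |0>
        state = np.kron(ry(2 * theta), I2) @ state     # R_y(2θ) on ancilla
        state = c_iY @ state                           # entangle with output
        state = np.kron(ry(-2 * theta), I2) @ state    # R_y(-2θ) on ancilla
        p0 = state[0] ** 2 + state[1] ** 2             # prob. ancilla reads 0
        if rng.random() < p0:                          # success branch
            return state[:2] / np.sqrt(p0)
        out = state[2:] / np.sqrt(1.0 - p0)            # failure branch
        out = ry(-np.pi / 2) @ out                     # undo the π/2 kick
    raise RuntimeError("RUS did not succeed within max_attempts")
```

The non-linearity is visible in the success branch: the output amplitudes satisfy |amp1/amp0| = tan(theta)**2, a non-linear function of the input angle, obtained only because the circuit branches on a mid-circuit measurement outcome.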
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - A Metalearned Neural Circuit for Nonparametric Bayesian Inference [4.767884267554628]
Most applications of machine learning to classification assume a closed set of balanced classes.
This is at odds with the real world, where class occurrence statistics often follow a long-tailed power-law distribution.
We present a method for extracting the inductive bias from a nonparametric Bayesian model and transferring it to an artificial neural network.
arXiv Detail & Related papers (2023-11-24T16:43:17Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Work statistics, quantum signatures and enhanced work extraction in quadratic fermionic models [62.997667081978825]
In quadratic fermionic models we determine a quantum correction to the work statistics after a sudden and a time-dependent driving.
Such a correction lies in the non-commutativity of the initial quantum state and the time-dependent Hamiltonian.
Thanks to the latter, one can assess the onset of non-classical signatures in the Kirkwood-Dirac quasiprobability (KDQ) distribution of work.
arXiv Detail & Related papers (2023-02-27T13:42:40Z) - Introducing Non-Linearity into Quantum Generative Models [0.0]
We introduce the Quantum Neuron Born Machine (QNBM), a model that adds non-linear activations via a neural network structure onto the standard Born Machine framework.
We compare our non-linear QNBM to the linear Quantum Circuit Born Machine (QCBM).
We show that while both models can easily learn a trivial uniform probability distribution, the QNBM achieves an almost 3x smaller error rate than the QCBM.
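For context, the linear baseline in this comparison generates samples by measuring a parameterized circuit, with bitstring probabilities given by the Born rule. A minimal two-qubit illustration is sketched below; the layer structure (R_y rotations around a CNOT entangler) and function name are assumptions for the sketch, not the papers' exact ansatz:

```python
import numpy as np

def ry(t):
    """Single-qubit Y rotation R_y(t) (real matrix)."""
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]])

def qcbm_probs(angles):
    """Two-qubit Born machine: R_y layer, CNOT entangler, R_y layer.

    Returns the Born-rule probabilities over bitstrings 00, 01, 10, 11.
    """
    state = np.zeros(4)
    state[0] = 1.0                                        # start in |00>
    state = np.kron(ry(angles[0]), ry(angles[1])) @ state # rotation layer
    cnot = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]])         # entangling gate
    state = cnot @ state
    state = np.kron(ry(angles[2]), ry(angles[3])) @ state # rotation layer
    return state ** 2  # Born rule |<x|psi>|^2 (amplitudes are real here)
```

Sampling amounts to drawing bitstrings from these probabilities (e.g. `rng.choice(4, p=qcbm_probs(angles))`), and training adjusts the angles to minimize a divergence between this distribution and the target. Every operation before the final measurement is linear in the state, which is the property the QNBM's RUS neurons break.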
arXiv Detail & Related papers (2022-05-28T18:59:49Z) - Learning Reduced Nonlinear State-Space Models: an Output-Error Based Canonical Approach [8.029702645528412]
We investigate the effectiveness of deep learning in the modeling of dynamic systems with nonlinear behavior.
We show its ability to identify three different nonlinear systems.
The performances are evaluated in terms of open-loop prediction on test data generated in simulation as well as a real world data-set of unmanned aerial vehicle flight measurements.
arXiv Detail & Related papers (2022-04-19T06:33:23Z) - Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies for this purpose.
arXiv Detail & Related papers (2022-02-17T07:56:46Z) - Compositional Modeling of Nonlinear Dynamical Systems with ODE-based Random Features [0.0]
We present a novel, domain-agnostic approach to tackling this problem.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
arXiv Detail & Related papers (2021-06-10T17:55:13Z) - Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We make a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.