Linear Regression Using Quantum Annealing with Continuous Variables
- URL: http://arxiv.org/abs/2410.08569v1
- Date: Fri, 11 Oct 2024 06:49:09 GMT
- Title: Linear Regression Using Quantum Annealing with Continuous Variables
- Authors: Asuka Koura, Takashi Imoto, Katsuki Ura, Yuichiro Matsuzaki,
- Abstract summary: The boson system facilitates the optimization of linear regression without resorting to discrete approximations.
The major benefit of our new approach is that it can ensure accuracy without increasing the number of qubits as long as the adiabatic condition is satisfied.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Linear regression is a data analysis technique categorized as supervised learning: by utilizing known data, we can predict unknown data. Recently, researchers have explored the use of quantum annealing (QA) to perform linear regression in which the parameters are approximated to discrete values using binary numbers. However, this approach has a limitation: the number of qubits must be increased to improve the accuracy. Here, we propose a novel linear regression method using QA that leverages continuous variables. In particular, a boson system facilitates the optimization of linear regression without resorting to discrete approximations, as it directly handles continuous variables while performing QA. The major benefit of our new approach is that it can ensure accuracy without increasing the number of qubits, as long as the adiabatic condition is satisfied.
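The discrete-approximation baseline that the abstract contrasts with can be illustrated with a tiny sketch. The code below is our illustration, not the paper's method: the precision vector `p` and the offset encoding are assumptions. Each regression weight is encoded as a binary expansion over `p`, and the resulting QUBO-style least-squares objective is brute-forced over the bits, showing how accuracy is capped by the number of bits (qubits) per weight.

```python
import itertools
import numpy as np

# Hypothetical sketch of the binary-discretization baseline: each weight
# w_j is a signed sum of bits scaled by a "precision vector" p, turning
# min_w ||X w - y||^2 into a binary optimization over the bits.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))
w_true = np.array([0.75, -0.5])
y = X @ w_true

p = np.array([1.0, 0.5, 0.25, 0.125])  # assumed precision vector (powers of 2)
d, k = X.shape[1], len(p)

def weights_from_bits(bits):
    # bits has shape (d, k); the constant offset maps w_j into [-1, 1):
    # w_j = sum_i p_i * b_{j,i} - 1
    b = np.asarray(bits, dtype=float).reshape(d, k)
    return b @ p - 1.0

best_bits, best_err = None, np.inf
for bits in itertools.product([0, 1], repeat=d * k):
    w = weights_from_bits(bits)
    err = np.sum((X @ w - y) ** 2)
    if err < best_err:
        best_bits, best_err = bits, err

w_hat = weights_from_bits(best_bits)
print("recovered weights:", w_hat)
```

Doubling the length of `p` halves the smallest representable step, which is exactly the qubit-count-versus-accuracy trade-off that the continuous-variable (boson) formulation avoids.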
Related papers
- Adaptive Learning for Quantum Linear Regression [10.445957451908695]
In a recent work, linear regression was formulated as a quadratic binary optimization problem.
This approach promises a computational time advantage for large datasets.
However, the quality of the solution is limited by the necessary use of a precision vector.
In this work, we focus on the practical challenge of improving the precision vector encoding.
arXiv Detail & Related papers (2024-08-05T21:09:01Z)
- LFFR: Logistic Function For (single-output) Regression [0.0]
We implement privacy-preserving regression training using data encrypted under a fully homomorphic encryption scheme.
We develop a novel and efficient algorithm called LFFR for homomorphic regression using the logistic function.
arXiv Detail & Related papers (2024-07-13T17:33:49Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
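The empirical quantile estimation that the summary above refers to can be sketched with the pinball (check) loss: minimizing it over a constant predictor yields an empirical tau-quantile of the data. The snippet below is an illustrative sketch, not code from the paper; the toy data and the grid search are our assumptions.

```python
import numpy as np

# Pinball (check) loss: the standard building block of quantile
# regression. Its minimizer over a constant is an empirical
# tau-quantile of the residual distribution.
def pinball_loss(residuals, tau):
    r = np.asarray(residuals, dtype=float)
    return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # toy data with a heavy right tail
grid = np.linspace(0.0, 100.0, 10001)      # brute-force search over constants
q90 = grid[np.argmin([pinball_loss(y - c, 0.9) for c in grid])]
print("0.9-quantile estimate:", q90)
```

Fitting the same loss with linear (rather than constant) predictors at two quantile levels is what produces the prediction intervals discussed above.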
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Anchor Data Augmentation [53.39044919864444]
We propose a novel algorithm for data augmentation in nonlinear over-parametrized regression.
Our data augmentation algorithm borrows from the literature on causality and extends the recently proposed Anchor regression (AR) method for data augmentation.
arXiv Detail & Related papers (2023-11-12T21:08:43Z)
- Streaming Sparse Linear Regression [1.8707139489039097]
We propose a novel online sparse linear regression framework for analyzing streaming data when data points arrive sequentially.
Our proposed method is memory efficient and requires less stringent restricted strong convexity assumptions.
arXiv Detail & Related papers (2022-11-11T07:31:55Z)
- Decoupling Shrinkage and Selection for the Bayesian Quantile Regression [0.0]
This paper extends the idea of decoupling shrinkage and sparsity for continuous priors to Bayesian Quantile Regression (BQR).
In the first step, we shrink the quantile regression posterior through state-of-the-art continuous priors; in the second step, we sparsify the posterior through an efficient variant of the adaptive lasso.
Our procedure can be used to communicate to policymakers which variables drive downside risk to the macro economy.
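The second "sparsify" step above uses an adaptive-lasso variant. As a rough illustration (our sketch, not the paper's algorithm), in the orthonormal-design special case the adaptive lasso reduces to coefficient-wise soft-thresholding with weights proportional to 1/|beta_j|, which zeroes out small coefficients while barely shrinking large ones.

```python
import numpy as np

# Hypothetical sketch: adaptive soft-thresholding of shrunken posterior
# means. The penalty on coefficient j scales with 1 / |beta_hat_j|, so
# small coefficients face a large threshold (and are set exactly to 0)
# while large coefficients are only lightly shrunk.
def adaptive_soft_threshold(beta_hat, lam):
    b = np.asarray(beta_hat, dtype=float)
    weights = 1.0 / np.maximum(np.abs(b), 1e-12)  # adaptive penalty weights
    thresh = lam * weights
    return np.sign(b) * np.maximum(np.abs(b) - thresh, 0.0)

beta_hat = np.array([2.5, -1.8, 0.05, 0.01, 0.9])  # assumed posterior means
print(adaptive_soft_threshold(beta_hat, 0.1))
```

The resulting zero pattern is what lets the procedure report a small set of variables driving downside risk.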
arXiv Detail & Related papers (2021-07-18T17:22:33Z)
- Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates [68.09049111171862]
This work focuses on quantifying, reducing, and analyzing regression errors in NLP model updates.
We formulate the regression-free model updates into a constrained optimization problem.
We empirically analyze how model ensemble reduces regression.
arXiv Detail & Related papers (2021-05-07T03:33:00Z)
- Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning [78.83598532168256]
Marginal-likelihood based model-selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z)
- A Hypergradient Approach to Robust Regression without Correspondence [85.49775273716503]
We consider a variant of the regression problem in which the correspondence between input and output data is not available.
Most existing methods are only applicable when the sample size is small.
We propose a new computational framework -- ROBOT -- for the shuffled regression problem.
arXiv Detail & Related papers (2020-11-30T21:47:38Z)
- A spectral algorithm for robust regression with subgaussian rates [0.0]
We study a new algorithm, running in linear up to quadratic time, for linear regression in the absence of strong assumptions on the underlying distributions of the samples.
The goal is to design a procedure which attains the optimal sub-gaussian error bound even though the data have only finite moments.
arXiv Detail & Related papers (2020-07-12T19:33:50Z)
- Fast OSCAR and OWL Regression via Safe Screening Rules [97.28167655721766]
Ordered Weighted $L_1$ (OWL) regularized regression is a new regression analysis for high-dimensional sparse learning.
Proximal gradient methods are used as standard approaches to solve OWL regression.
We propose the first safe screening rule for OWL regression by exploring the order of the primal solution with the unknown order structure.
arXiv Detail & Related papers (2020-06-29T23:35:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.