Quantum Annealing for Automated Feature Selection in Stress Detection
- URL: http://arxiv.org/abs/2106.05134v1
- Date: Wed, 9 Jun 2021 15:17:48 GMT
- Title: Quantum Annealing for Automated Feature Selection in Stress Detection
- Authors: Rajdeep Kumar Nath, Himanshu Thapliyal, Travis S. Humble
- Abstract summary: We present a novel methodology for automated feature subset selection from a pool of physiological signals using Quantum Annealing (QA)
Features are extracted from four signal sources: foot EDA, hand EDA, ECG, and respiration.
Results indicate that QA-based feature subset selection performed on par with classical techniques.
- Score: 1.8047694351309205
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a novel methodology for automated feature subset selection from a
pool of physiological signals using Quantum Annealing (QA). As a case study, we
will investigate the effectiveness of QA-based feature selection techniques in
selecting the optimal feature subset for stress detection. Features are
extracted from four signal sources: foot EDA, hand EDA, ECG, and respiration.
The proposed method embeds the feature variables extracted from the
physiological signals in a binary quadratic model. The bias of the feature
variable is calculated using the Pearson correlation coefficient between the
feature variable and the target variable. The weight of the edge connecting the
two feature variables is calculated using the Pearson correlation coefficient
between two feature variables in the binary quadratic model. Subsequently,
D-Wave's clique sampler is used to sample cliques from the binary quadratic
model. The underlying solution is then re-sampled to obtain multiple good
solutions and the clique with the lowest energy is returned as the optimal
solution. The proposed method is compared with commonly used feature selection
techniques for stress detection. Results indicate that QA-based feature subset
selection performed on par with classical techniques. However, under
data-uncertainty conditions such as limited training data, the performance of
quantum annealing for selecting optimal features remained unaffected, whereas a
significant decrease in performance was observed with the classical feature
selection techniques. Preliminary results show the promise of quantum annealing
in optimizing the training phase of a machine learning classifier, especially
under data uncertainty conditions.
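The pipeline in the abstract can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the sign convention (a negative linear bias rewards feature-target relevance, a positive quadratic weight penalizes feature-feature redundancy) is an assumption consistent with the abstract's description, and D-Wave's clique sampler is replaced here by a brute-force search over all binary assignments, which is only feasible for small feature pools.

```python
# Sketch of the QUBO/BQM feature-selection formulation described above.
# Assumptions: negative bias = relevance reward, positive coupling =
# redundancy penalty; the annealer is stood in for by exhaustive search.
from itertools import combinations, product
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def build_bqm(features, target):
    """Linear biases from feature-target correlation, quadratic weights
    from pairwise feature-feature correlation, as in the abstract."""
    k = len(features)
    linear = {i: -abs(pearson(features[i], target)) for i in range(k)}
    quadratic = {(i, j): abs(pearson(features[i], features[j]))
                 for i, j in combinations(range(k), 2)}
    return linear, quadratic

def energy(x, linear, quadratic):
    """QUBO energy of a binary assignment x (x[i] = 1 selects feature i)."""
    return (sum(linear[i] * x[i] for i in linear)
            + sum(w * x[i] * x[j] for (i, j), w in quadratic.items()))

def best_subset(linear, quadratic):
    """Brute-force stand-in for the annealer: the lowest-energy assignment."""
    k = len(linear)
    return min(product((0, 1), repeat=k),
               key=lambda x: energy(x, linear, quadratic))
```

On real hardware the `linear`/`quadratic` dictionaries would instead populate a binary quadratic model handed to a clique sampler, with the solution re-sampled to collect multiple low-energy candidates as described above.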
Related papers
- Knoop: Practical Enhancement of Knockoff with Over-Parameterization for Variable Selection [27.563529091471935]
This work introduces a novel approach, Knockoff with over-parameterization (Knoop), to enhance variable selection.
Knoop generates multiple knockoff variables for each original variable and integrates them with the original variables into a Ridgeless regression model.
Experiments demonstrate superior performance compared to existing methods in both simulation and real-world datasets.
arXiv Detail & Related papers (2025-01-28T09:27:04Z) - A Hybrid Framework for Statistical Feature Selection and Image-Based Noise-Defect Detection [55.2480439325792]
This paper presents a hybrid framework that integrates both statistical feature selection and classification techniques to improve defect detection accuracy.
We present around 55 distinguished features that are extracted from industrial images, which are then analyzed using statistical methods.
By integrating these methods with flexible machine learning applications, the proposed framework improves detection accuracy and reduces false positives and misclassifications.
arXiv Detail & Related papers (2024-12-11T22:12:21Z) - Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z) - A Performance-Driven Benchmark for Feature Selection in Tabular Deep Learning [131.2910403490434]
Data scientists typically collect as many features as possible into their datasets, and even engineer new features from existing ones.
Existing benchmarks for tabular feature selection consider classical downstream models, toy synthetic datasets, or do not evaluate feature selectors on the basis of downstream performance.
We construct a challenging feature selection benchmark evaluated on downstream neural networks including transformers.
We also propose an input-gradient-based analogue of Lasso for neural networks that outperforms classical feature selection methods on challenging problems.
arXiv Detail & Related papers (2023-11-10T05:26:10Z) - Causal Feature Selection via Transfer Entropy [59.999594949050596]
Causal discovery aims to identify causal relationships between features with observational data.
We introduce a new causal feature selection approach that relies on the forward and backward feature selection procedures.
We provide theoretical guarantees on the regression and classification errors for both the exact and the finite-sample cases.
arXiv Detail & Related papers (2023-10-17T08:04:45Z) - Fermionic Adaptive Sampling Theory for Variational Quantum Eigensolvers [0.0]
ADAPT-VQE suffers from a significant measurement overhead when estimating the importance of operators in the wave function.
We propose FAST-VQE, a method for selecting operators based on importance metrics solely derived from the populations of Slater determinants in the wave function.
arXiv Detail & Related papers (2023-03-13T18:57:18Z) - An Advantage Using Feature Selection with a Quantum Annealer [0.0]
Feature selection is a technique in statistical prediction modeling that identifies features in a record with a strong statistical connection to the target variable.
This paper tests this intuition against classical methods by utilizing open-source data sets and evaluating the efficacy of each trained statistical model.
arXiv Detail & Related papers (2022-11-17T18:32:26Z) - Feature Selection for Classification with QAOA [11.516147824168732]
Feature selection is of great importance in Machine Learning, where it can be used to reduce the dimensionality of classification, ranking and prediction problems.
We consider in particular a quadratic feature selection problem that can be tackled with the Quantum Approximate Optimization Algorithm (QAOA), already employed in optimization.
In our experiments, we consider seven different real-world datasets with dimensionality up to 21 and run QAOA on both a quantum simulator and, for small datasets, the 7-qubit IBM (ibm-perth) quantum computer.
arXiv Detail & Related papers (2022-11-05T09:28:53Z) - Quantum Feature Selection [2.5934039615414615]
In machine learning, fewer features reduce model complexity.
We propose a novel feature selection algorithm based on a quadratic unconstrained binary optimization problem.
In contrast to iterative or greedy methods, our direct approach yields higher-quality solutions.
arXiv Detail & Related papers (2022-03-24T16:22:25Z) - Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z) - Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how the variability in solvers' space can improve neural ODEs performance.
We show that the right choice of solver parameterization can significantly affect neural ODEs models in terms of robustness to adversarial attacks.
arXiv Detail & Related papers (2021-03-15T17:26:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.