Quantum Machine Learning Applied to the Sinking of the Titanic
- URL: http://arxiv.org/abs/2509.00916v1
- Date: Sun, 31 Aug 2025 16:00:52 GMT
- Title: Quantum Machine Learning Applied to the Sinking of the Titanic
- Authors: Luiz Henrique Prudencio dos Santos, Eliane F. Chinaglia, Jessica Fleury Curado, Marcilei A. Guazzelli, Mariana Pojar, Sueli Hatsumi Masunaga, Roberto Baginski Batista Santos
- Abstract summary: Quantum models were constructed using Pauli entangling and non-entangling expansion-based feature maps and the RealAmplitudes ansatz with up to 50 variational parameters. Model training employed the COBYLA gradient-free optimizer to minimize the cross-entropy loss.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work investigates the performance of hybrid quantum-classical variational classifiers applied to a supervised learning task involving the titanic3 dataset. Quantum models were constructed using Pauli entangling and non-entangling expansion-based feature maps and the RealAmplitudes ansatz with up to 50 variational parameters. Model training employed the COBYLA gradient-free optimizer to minimize the cross-entropy loss, within an ideal statevector simulation framework. Comparative performance analysis reveals that the models based on the non-entangling feature map consistently outperformed the models based on the entangling feature maps, achieving saturation of classification metrics (accuracy, balanced accuracy, and Youden's index) beyond 15 to 20 parameters. Further, two quantum models were benchmarked against a classical Support Vector Classifier (SVC). While both approaches yielded similar predictive performance across multiple training sizes, the classical model exhibited a performance collapse when trained with 90% of the dataset, a failure mode absent in the quantum classifiers. These results underscore the robustness and viability of variational quantum classifiers for binary classification tasks on classical datasets in the NISQ era.
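The pipeline described in the abstract (feature-map encoding, RealAmplitudes ansatz, COBYLA minimizing cross-entropy on an ideal statevector simulation) can be sketched end to end in plain NumPy/SciPy. This is a deliberately tiny 2-qubit stand-in, not the authors' code: the non-entangling RY feature map, the 4-parameter RY/CNOT ansatz, and the toy data (mimicking two scaled titanic3 features) are all simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with control on qubit 0, target on qubit 1
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def circuit_probs(x, params):
    """Statevector simulation of a 2-qubit classifier: a non-entangling
    RY feature map followed by a RealAmplitudes-style ansatz
    (RY layers with a CNOT entangler). Returns P(qubit 0 = |1>)."""
    state = np.zeros(4); state[0] = 1.0
    state = np.kron(ry(x[0]), ry(x[1])) @ state          # feature map
    state = np.kron(ry(params[0]), ry(params[1])) @ state  # ansatz layer 1
    state = CNOT @ state                                   # entangler
    state = np.kron(ry(params[2]), ry(params[3])) @ state  # ansatz layer 2
    p = np.abs(state) ** 2
    return p[2] + p[3]   # basis states |10>, |11> have qubit 0 in |1>

def cross_entropy(params, X, y, eps=1e-9):
    """Binary cross-entropy of the circuit's output probability."""
    p = np.array([circuit_probs(x, params) for x in X])
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# toy separable data standing in for two features scaled to [0, pi]
rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(40, 2))
y = (X[:, 0] > np.pi / 2).astype(float)

# gradient-free training with COBYLA, as in the paper's setup
res = minimize(cross_entropy, x0=np.zeros(4), args=(X, y), method="COBYLA")
acc = np.mean((np.array([circuit_probs(x, res.x) for x in X]) > 0.5) == y)
```

The real study uses Pauli-expansion feature maps and up to 50 parameters; the structure above only illustrates how the encoding, ansatz, loss, and optimizer fit together.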
Related papers
- Comparing Classical and Quantum Variational Classifiers on the XOR Problem [0.0]
Variational quantum models operate on qubits in high-dimensional Hilbert spaces. We compare classical models and a variational quantum classifier on the XOR problem.
arXiv Detail & Related papers (2026-02-27T17:46:52Z) - Evaluating Angle and Amplitude Encoding Strategies for Variational Quantum Machine Learning: their impact on model's accuracy [0.6553587309274792]
Variational Quantum Circuit (VQC) is a hybrid model where the quantum circuit handles data inference while classical optimization adjusts the parameters of the circuit. This work analyses both Amplitude- and Angle-encoding models and examines how the type of rotational gate applied affects the classification performance of the model. The study demonstrates that, under identical model topologies, the difference in accuracy between the best and worst models ranges from 10% to 30%, with differences reaching up to 41%.
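The two encoding strategies this entry compares differ in how classical features become a quantum state. A minimal NumPy sketch of both, under the usual conventions (one RY rotation per feature for angle encoding; a normalized, zero-padded feature vector as the statevector for amplitude encoding):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def angle_encode(x):
    """Angle encoding: one qubit per feature, each feature value used as
    an RY rotation angle. n features -> 2**n statevector amplitudes."""
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, ry(xi) @ np.array([1.0, 0.0]))
    return state

def amplitude_encode(x):
    """Amplitude encoding: the zero-padded, L2-normalized feature vector
    becomes the statevector itself. 2**n features -> only n qubits."""
    dim = 1 << max(1, int(np.ceil(np.log2(len(x)))))
    padded = np.zeros(dim)
    padded[:len(x)] = x
    return padded / np.linalg.norm(padded)

x = np.array([0.3, 1.2, 0.7, 2.0])
a = angle_encode(x)       # 4 features -> 4 qubits, 16 amplitudes
b = amplitude_encode(x)   # 4 features -> 2 qubits, 4 amplitudes
```

The qubit-count trade-off is visible directly: amplitude encoding is exponentially more compact, but preparing an arbitrary amplitude-encoded state on hardware is generally far more expensive than a layer of single-qubit rotations.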
arXiv Detail & Related papers (2025-08-01T16:43:45Z) - Quantum Neural Networks in Practice: A Comparative Study with Classical Models from Standard Data Sets to Industrial Images [0.5892638927736115]
We compare the performance of randomized classical and quantum neural networks (NNs) as well as classical and quantum-classical hybrid convolutional neural networks (CNNs) for the task of binary image classification. We evaluate these approaches on three data sets of increasing complexity. Cross-dataset performance analysis revealed limited transferability of quantum models between different classification tasks.
arXiv Detail & Related papers (2024-11-28T17:13:45Z) - The Languini Kitchen: Enabling Language Modelling Research at Different Scales of Compute [66.84421705029624]
We introduce an experimental protocol that enables model comparisons based on equivalent compute, measured in accelerator hours.
We pre-process an existing large, diverse, and high-quality dataset of books that surpasses existing academic benchmarks in quality, diversity, and document length.
This work also provides two baseline models: a feed-forward model derived from the GPT-2 architecture and a recurrent model in the form of a novel LSTM with ten-fold throughput.
arXiv Detail & Related papers (2023-09-20T10:31:17Z) - Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL) a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
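The core mechanism in that entry, a state-space model over linear predictor weights updated online, can be sketched with a standard Kalman filter. The sketch below is a simplified scalar-observation version (closer to online regression than the paper's multi-class classifier); the random-walk transition that captures non-stationarity is the key ingredient:

```python
import numpy as np

def kalman_step(mu, Sigma, x, y, q=1e-3, r=0.1):
    """One online step of a Kalman filter over linear predictor weights:
    a random-walk transition (predict) inflates uncertainty, then a
    conjugate Bayesian update conditions on y ~ N(w . x, r)."""
    Sigma = Sigma + q * np.eye(len(mu))      # predict: weights drift
    s = x @ Sigma @ x + r                    # innovation variance
    k = Sigma @ x / s                        # Kalman gain
    mu = mu + k * (y - x @ mu)               # posterior mean
    Sigma = Sigma - np.outer(k, x @ Sigma)   # posterior covariance
    return mu, Sigma

# stream with a slowly drifting ground-truth weight vector
rng = np.random.default_rng(1)
w_true = np.array([1.0, -1.0])
mu, Sigma = np.zeros(2), np.eye(2)
for t in range(500):
    w_true = w_true + 0.002 * rng.standard_normal(2)  # non-stationarity
    x = rng.standard_normal(2)
    y = w_true @ x + 0.1 * rng.standard_normal()
    mu, Sigma = kalman_step(mu, Sigma, x, y)          # predict + update
err = np.linalg.norm(mu - w_true)
```

Because the predict step keeps injecting uncertainty (`q`), the filter never freezes onto stale weights and continues to track the drifting target, which is exactly the behavior a fixed least-squares fit lacks.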
arXiv Detail & Related papers (2023-06-14T11:41:42Z) - ClusterQ: Semantic Feature Distribution Alignment for Data-Free Quantization [111.12063632743013]
We propose a new and effective data-free quantization method termed ClusterQ.
To obtain high inter-class separability of semantic features, we cluster and align the feature distribution statistics.
We also incorporate the intra-class variance to solve class-wise mode collapse.
arXiv Detail & Related papers (2022-04-30T06:58:56Z) - Dynamically-Scaled Deep Canonical Correlation Analysis [77.34726150561087]
Canonical Correlation Analysis (CCA) is a method for feature extraction of two views by finding maximally correlated linear projections of them.
We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model.
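For reference, the classical linear CCA that this entry builds on (not the paper's dynamically-scaled deep variant) reduces to whitening each view and taking the SVD of the cross-covariance; the singular values are the canonical correlations:

```python
import numpy as np

def linear_cca(X, Y, reg=1e-6):
    """Classical linear CCA: center and whiten each view, then SVD the
    whitened cross-covariance. Returns canonical correlations,
    sorted in descending order. `reg` is a small ridge for stability."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = len(X)
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n

    def inv_sqrt(C):
        # inverse matrix square root via eigendecomposition
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    T = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    return np.linalg.svd(T, compute_uv=False)

# two views that share exactly one latent signal z
rng = np.random.default_rng(2)
z = rng.standard_normal(1000)
X = np.c_[z, rng.standard_normal(1000)] + 0.1 * rng.standard_normal((1000, 2))
Y = np.c_[-z, rng.standard_normal(1000)] + 0.1 * rng.standard_normal((1000, 2))
corrs = linear_cca(X, Y)   # first correlation high, second near zero
```

The shared latent `z` yields one strong canonical correlation while the independent second dimensions contribute almost nothing, which is the behavior deep and input-dependent CCA variants generalize to nonlinear projections.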
arXiv Detail & Related papers (2022-03-23T12:52:49Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Binary classifiers for noisy datasets: a comparative study of existing quantum machine learning frameworks and some new approaches [0.0]
We apply Quantum Machine Learning frameworks to improve binary classification.
Noisy datasets arise, for example, in financial data. The new models exhibit better learning characteristics under asymmetrical noise in the dataset.
arXiv Detail & Related papers (2021-11-05T10:29:05Z) - Quantum Machine Learning with SQUID [64.53556573827525]
We present the Scaled QUantum IDentifier (SQUID), an open-source framework for exploring hybrid Quantum-Classical algorithms for classification problems.
We provide examples of using SQUID in a standard binary classification problem from the popular MNIST dataset.
arXiv Detail & Related papers (2021-04-30T21:34:11Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Anomaly detection with variational quantum generative adversarial networks [0.0]
Generative adversarial networks (GANs) are a machine learning framework comprising a generative model for sampling from a target distribution.
We introduce variational quantum-classical Wasserstein GANs to address these issues and embed this model in a classical machine learning framework for anomaly detection.
Our model replaces the generator of Wasserstein GANs with a hybrid quantum-classical neural net and leaves the classical discriminative model unchanged.
arXiv Detail & Related papers (2020-10-20T17:48:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.