A Novel XAI-Enhanced Quantum Adversarial Networks for Velocity Dispersion Modeling in MaNGA Galaxies
- URL: http://arxiv.org/abs/2510.24598v1
- Date: Tue, 28 Oct 2025 16:27:10 GMT
- Title: A Novel XAI-Enhanced Quantum Adversarial Networks for Velocity Dispersion Modeling in MaNGA Galaxies
- Authors: Sathwik Narkedimilli, N V Saran Kumar, Aswath Babu H, Manjunath K Vanahalli, Manish M, Vinija Jain, Aman Chadha
- Abstract summary: We propose a novel quantum adversarial framework that integrates a hybrid quantum neural network (QNN) with classical deep learning layers. In the proposed model, an adversarial evaluator concurrently guides the QNN by computing feedback loss, thereby optimizing both prediction accuracy and model explainability. Empirical evaluations show that the Vanilla model achieves RMSE = 0.27, MSE = 0.071, MAE = 0.21, and R^2 = 0.59, delivering the most consistent performance across regression metrics compared to adversarial counterparts.
- Score: 14.016108312641101
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current quantum machine learning approaches often face challenges balancing predictive accuracy, robustness, and interpretability. To address this, we propose a novel quantum adversarial framework that integrates a hybrid quantum neural network (QNN) with classical deep learning layers, guided by an evaluator model with LIME-based interpretability, and extended through quantum GAN and self-supervised variants. In the proposed model, an adversarial evaluator concurrently guides the QNN by computing feedback loss, thereby optimizing both prediction accuracy and model explainability. Empirical evaluations show that the Vanilla model achieves RMSE = 0.27, MSE = 0.071, MAE = 0.21, and R^2 = 0.59, delivering the most consistent performance across regression metrics compared to adversarial counterparts. These results demonstrate the potential of combining quantum-inspired methods with classical architectures to develop lightweight, high-performance, and interpretable predictive models, advancing the applicability of QML beyond current limitations.
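The four regression metrics reported in the abstract (RMSE, MSE, MAE, R^2) have standard definitions; the sketch below computes them in plain NumPy. The data are synthetic stand-ins, not the MaNGA velocity-dispersion measurements used in the paper.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute the four regression metrics reported in the abstract."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = np.mean((y_true - y_pred) ** 2)
    rmse = np.sqrt(mse)
    mae = np.mean(np.abs(y_true - y_pred))
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return {"RMSE": rmse, "MSE": mse, "MAE": mae, "R2": r2}

# Illustrative values only, not the paper's data
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])
print(regression_metrics(y_true, y_pred))
```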
Related papers
- Towards Quantum Enhanced Adversarial Robustness with Rydberg Reservoir Learning [45.92935470813908]
Quantum reservoir computing (QRC) leverages the high-dimensional, nonlinear dynamics inherent in quantum many-body systems. Recent studies indicate that quantum models based on variational circuits remain susceptible to adversarial perturbations. We present the first systematic evaluation of adversarial robustness in a QRC-based learning model.
arXiv Detail & Related papers (2025-10-15T12:17:23Z) - From Distributional to Quantile Neural Basis Models: the case of Electricity Price Forecasting [42.062078728472734]
We introduce the Quantile Neural Basis Model, which incorporates the interpretability principles of Quantile Generalized Additive Models. We validate our approach on day-ahead electricity price forecasting, achieving predictive performance comparable to distributional and quantile regression neural networks.
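Quantile regression models like the one above are typically trained with the pinball (quantile) loss. The sketch below is a minimal NumPy implementation with synthetic values, not electricity-price data, and is not taken from that paper's code.

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Average pinball loss at quantile level q in (0, 1).

    Overprediction is penalized with weight (1 - q),
    underprediction with weight q.
    """
    diff = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return np.mean(np.maximum(q * diff, (q - 1.0) * diff))

# At q = 0.5 the pinball loss is half the mean absolute error
print(pinball_loss([1.0, 3.0], [2.0, 2.0], 0.5))
```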
arXiv Detail & Related papers (2025-09-17T15:55:59Z) - Hybrid Quantum-Classical Neural Networks for Few-Shot Credit Risk Assessment [52.05742536403784]
This work tackles the challenge of few-shot credit risk assessment. We design and implement a novel hybrid quantum-classical workflow. A Quantum Neural Network (QNN) was trained via the parameter-shift rule. On a real-world credit dataset of 279 samples, our QNN achieved a robust average AUC of 0.852 +/- 0.027 in simulations and yielded an impressive AUC of 0.88 in the hardware experiment.
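The parameter-shift rule mentioned above gives exact gradients of a circuit's expectation values from two shifted circuit evaluations. A minimal single-qubit illustration, simulated with plain NumPy matrices rather than a quantum SDK (the circuit in that paper is of course larger):

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z observable

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def expval_z(theta):
    """<Z> after applying RY(theta) to |0>; analytically cos(theta)."""
    state = ry(theta) @ np.array([1, 0], dtype=complex)
    return (state.conj() @ Z @ state).real

def parameter_shift_grad(theta):
    """Exact gradient d<Z>/dtheta from two shifted evaluations."""
    return 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))

theta = 0.7
# The shift-rule gradient matches the analytic derivative -sin(theta)
print(parameter_shift_grad(theta), -np.sin(theta))
```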
arXiv Detail & Related papers (2025-09-17T08:36:05Z) - TensoMeta-VQC: A Tensor-Train-Guided Meta-Learning Framework for Robust and Scalable Variational Quantum Computing [60.996803677584424]
TensoMeta-VQC is a novel tensor-train (TT)-guided meta-learning framework designed to improve the robustness and scalability of VQC significantly. Our framework fully delegates the generation of quantum circuit parameters to a classical TT network, effectively decoupling optimization from quantum hardware.
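The core idea, a classical low-rank network generating the full grid of circuit parameters, can be sketched with a two-core tensor-train contraction. All shapes and the rank below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
layers, qubits, rank = 6, 8, 2  # illustrative sizes, not from the paper

# Two trainable TT cores replace a direct (layers x qubits) parameter grid
core1 = rng.normal(size=(layers, rank))
core2 = rng.normal(size=(rank, qubits))

# Contracting the cores yields one rotation angle per (layer, qubit) slot
circuit_params = core1 @ core2

tt_params = core1.size + core2.size  # 6*2 + 2*8 = 28 trainable values
full_params = layers * qubits        # 48 if parameterized directly
print(circuit_params.shape, tt_params, full_params)
```

The compression grows with circuit size: the TT parameter count scales with the rank rather than with layers times qubits.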
arXiv Detail & Related papers (2025-08-01T23:37:55Z) - Quantum-Informed Machine Learning for Predicting Spatiotemporal Chaos [1.5216516276847551]
We introduce a quantum-informed machine learning (QIML) framework for predicting the long-term dynamical behavior of high-dimensional chaotic systems. We evaluate QIML on three representative systems: the Kuramoto-Sivashinsky equation, the two-dimensional Kolmogorov flow, and a cross-section of fully developed three-dimensional turbulent channel flow.
arXiv Detail & Related papers (2025-07-26T08:36:16Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture where a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
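The weight-generation idea can be sketched as follows: circuit expectation values are reshaped into the first-layer weight matrix of a small MLP, so inference needs no quantum hardware. The "circuit" here is a mocked stand-in (a bounded nonlinearity standing in for expectation values), and all dimensions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
in_dim, hidden, out_dim = 4, 3, 1  # illustrative sizes

def mock_vqc_outputs(thetas):
    # Stand-in for circuit expectation values, bounded in [-1, 1] like <Z>
    return np.tanh(thetas)

thetas = rng.normal(size=in_dim * hidden)               # trainable circuit angles
W1 = mock_vqc_outputs(thetas).reshape(hidden, in_dim)   # VQC-generated weights
W2 = rng.normal(size=(out_dim, hidden))                 # ordinary classical layer

def mlp(x):
    """Inference is fully classical once W1 has been generated."""
    return W2 @ np.tanh(W1 @ x)

print(mlp(np.ones(in_dim)))
```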
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - KITINet: Kinetics Theory Inspired Network Architectures with PDE Simulation Approaches [43.872190335490515]
This paper introduces KITINet, a novel architecture that reinterprets feature propagation through the lens of non-equilibrium particle dynamics. At its core, we propose a residual module that models feature updates as the evolution of a particle system. This formulation mimics particle collisions and energy exchange, enabling adaptive feature refinement via physics-informed interactions.
arXiv Detail & Related papers (2025-05-23T13:58:29Z) - HQNN-FSP: A Hybrid Classical-Quantum Neural Network for Regression-Based Financial Stock Market Prediction [3.5418331252013897]
This study explores the potential of hybrid quantum-classical approaches to assist in financial trend prediction. A custom Quantum Neural Network (QNN) regressor is introduced, designed with a novel ansatz tailored for financial applications.
arXiv Detail & Related papers (2025-03-19T16:44:21Z) - Hybrid Quantum Neural Networks with Variational Quantum Regressor for Enhancing QSPR Modeling of CO2-Capturing Amine [0.9968037829925945]
We develop hybrid quantum neural networks (HQNN) to improve structure-property relationship modeling for CO2-capturing amines. HQNNs improve predictive accuracy for key solvent properties, including basicity, viscosity, boiling point, melting point, and vapor pressure.
arXiv Detail & Related papers (2025-03-01T07:26:45Z) - Hybrid Quantum Neural Networks with Amplitude Encoding: Advancing Recovery Rate Predictions [6.699192644249841]
Recovery rate prediction plays a pivotal role in bond investment strategies by enhancing risk assessment, optimizing portfolio allocation, and improving pricing accuracy. We propose a hybrid Quantum Machine Learning (QML) model with Amplitude Encoding. We evaluate the model on a global recovery rate dataset comprising 1,725 observations from 1996 to 2023.
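Amplitude encoding maps a length-2^n feature vector onto the amplitudes of an n-qubit state by L2-normalizing it, so n qubits hold 2^n features. A minimal sketch with synthetic features, not the recovery-rate data from that paper:

```python
import numpy as np

def amplitude_encode(x):
    """Return the normalized amplitude vector and the qubit count it needs."""
    x = np.asarray(x, dtype=float)
    norm = np.linalg.norm(x)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    n_qubits = int(np.log2(x.size))
    if 2 ** n_qubits != x.size:
        raise ValueError("feature length must be a power of two")
    return x / norm, n_qubits

# 4 synthetic features fit into the amplitudes of a 2-qubit state
state, n = amplitude_encode([0.5, 0.5, 0.5, 0.5])
print(state, n)
```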
arXiv Detail & Related papers (2025-01-27T07:27:23Z) - Leveraging Pre-Trained Neural Networks to Enhance Machine Learning with Variational Quantum Circuits [48.33631905972908]
We introduce an innovative approach that utilizes pre-trained neural networks to enhance Variational Quantum Circuits (VQC).
This technique effectively separates approximation error from qubit count and removes the need for restrictive conditions.
Our results extend to applications such as human genome analysis, demonstrating the broad applicability of our approach.
arXiv Detail & Related papers (2024-11-13T12:03:39Z) - Computable Model-Independent Bounds for Adversarial Quantum Machine Learning [4.857505043608425]
We introduce the first approximate lower bound on adversarial error for evaluating model resilience against quantum-based adversarial attacks.
In the best case, the experimental error is only 10% above the estimated bound, offering evidence of the inherent robustness of quantum models.
arXiv Detail & Related papers (2024-11-11T10:56:31Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.