Regression of Functions by Quantum Neural Networks Circuits
- URL: http://arxiv.org/abs/2512.19978v1
- Date: Tue, 23 Dec 2025 01:58:03 GMT
- Title: Regression of Functions by Quantum Neural Networks Circuits
- Authors: Fernando M. de Paula Neto, Lucas dos Reis Silva, Paulo S. G. de Mattos Neto, Felipe F. Fanchini
- Abstract summary: This work investigates automated quantum-circuit construction for regression tasks. It introduces a genetic-algorithm framework that discovers Reduced Regressor QNN architectures.
- Score: 41.09105068326236
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The performance of quantum neural network models depends strongly on architectural decisions, including circuit depth, placement of parametrized operations, and data-encoding strategies. Selecting an effective architecture is challenging and closely related to the classical difficulty of choosing suitable neural-network topologies, which is computationally hard. This work investigates automated quantum-circuit construction for regression tasks and introduces a genetic-algorithm framework that discovers Reduced Regressor QNN architectures. The approach explores depth, parametrized gate configurations, and flexible data re-uploading patterns, formulating the construction of quantum regressors as an optimization process. The discovered circuits are evaluated against seventeen classical regression models on twenty-two nonlinear benchmark functions and four analytical functions. Although classical methods often achieve comparable results, they typically require far more parameters, whereas the evolved quantum models remain compact while providing competitive performance. We further analyze dataset complexity using twelve structural descriptors and show, across five increasingly challenging meta-learning scenarios, that these measures can reliably predict which quantum architecture will perform best. The results demonstrate perfect or near-perfect predictive accuracy in several scenarios, indicating that complexity metrics offer powerful and compact representations of dataset structure and can effectively guide automated model selection. Overall, this study provides a principled basis for meta-learning-driven quantum architecture design and advances the understanding of how quantum models behave in regression settings--a topic that has received limited exploration in prior work. These findings pave the way for more systematic and theoretically grounded approaches to quantum regression.
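A minimal sketch of the kind of genetic search the abstract describes: evolving circuit "genomes" that encode depth, parametrized-gate choices, and data re-uploading flags. The gate set, encoding, and surrogate fitness below are illustrative assumptions, not the paper's actual implementation; in the paper's setting the fitness would be the (negative) regression error of the trained candidate circuit.

```python
import random

# A genome is a list of layers, each a (parametrized gate, re-upload input?) pair.
# Gate set, encoding, and fitness are illustrative assumptions, not the paper's.
GATES = ["RX", "RY", "RZ"]
MAX_DEPTH = 6

def random_genome():
    depth = random.randint(1, MAX_DEPTH)
    return [(random.choice(GATES), random.random() < 0.5) for _ in range(depth)]

def fitness(genome):
    # Stand-in surrogate: reward gate variety and data re-uploading,
    # penalize depth so that discovered circuits stay compact.
    variety = len({gate for gate, _ in genome})
    reuploads = sum(1 for _, reup in genome if reup)
    return variety + 0.5 * reuploads - 0.2 * len(genome)

def mutate(genome, rate=0.3):
    # Resample each layer with probability `rate`.
    out = [(random.choice(GATES), random.random() < 0.5)
           if random.random() < rate else layer
           for layer in genome]
    return out or random_genome()

def crossover(a, b):
    # One-point crossover with independent cut points, capped at MAX_DEPTH.
    cut_a, cut_b = random.randint(0, len(a)), random.randint(0, len(b))
    child = (a[:cut_a] + b[cut_b:])[:MAX_DEPTH]
    return child or random_genome()

def evolve(pop_size=20, generations=30):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 4]  # truncation selection (elitism)
        offspring = [mutate(crossover(random.choice(elite), random.choice(elite)))
                     for _ in range(pop_size - len(elite))]
        population = elite + offspring
    return max(population, key=fitness)
```

With elitism, the best fitness in the population is non-decreasing across generations; swapping the surrogate for a real train-and-score routine turns this skeleton into the optimization process the abstract formulates.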
Related papers
- Probabilistic Design of Parametrized Quantum Circuits through Local Gate Modifications [40.28072745340568]
We propose an evolution-inspired quantum architecture search algorithm, which we refer to as local quantum architecture search. Its goal is to optimize parametrized quantum circuit architectures. We evaluate the algorithm on two synthetic function-fitting regression tasks and two quantum chemistry regression datasets.
arXiv Detail & Related papers (2026-02-12T22:47:03Z) - Quantum Qualifiers for Neural Network Model Selection in Hadronic Physics [0.0]
We develop tools that guide model selection between classical and quantum deep neural networks based on intrinsic properties of the data. We show how relative model performance follows systematic trends in complexity, noise, and dimensionality, and how these trends can be distilled into a predictive criterion.
arXiv Detail & Related papers (2026-01-19T23:37:31Z) - A Comprehensively Adaptive Architectural Optimization-Ingrained Quantum Neural Network Model for Cloud Workloads Prediction [4.501295034557007]
This work proposes a novel Comprehensively Adaptive Architectural Optimization-based Variable Quantum Neural Network (CA-QNN). The model converts workload data into qubits, processed through qubit neurons with Controlled-NOT-gated activation functions for intuitive pattern recognition. The proposed model demonstrates superior prediction accuracy, reducing prediction errors by up to 93.40% and 91.27% compared to existing deep learning and QNN-based approaches.
arXiv Detail & Related papers (2025-07-11T05:07:21Z) - Selective Feature Re-Encoded Quantum Convolutional Neural Network with Joint Optimization for Image Classification [3.8876018618878585]
Quantum convolutional neural networks (QCNNs) have demonstrated promising results in classifying both quantum and classical data. This study proposes a novel strategy to enhance feature processing and a QCNN architecture for improved classification accuracy.
arXiv Detail & Related papers (2025-07-02T18:51:56Z) - Segmentation-Based Regression for Quantum Neural Networks [0.0]
Recent advances in quantum hardware motivate the development of algorithmic frameworks that integrate quantum sampling with classical inference. This work introduces a segmentation-based regression method tailored to quantum neural networks (QNNs). By casting the regression task as a constrained problem over a structured digit lattice, the method replaces continuous inference with interpretable and tractable updates.
arXiv Detail & Related papers (2025-06-27T20:11:43Z) - Scalable Quantum Architecture Search via Landscape Analysis [28.48505903998775]
Quantum architecture search (QAS) plays a pivotal role in variational quantum computing. We introduce a scalable, training-free QAS framework that efficiently explores and evaluates quantum circuits. Our framework attains robust performance on a challenging 50-qubit quantum many-body simulation.
arXiv Detail & Related papers (2025-05-08T16:13:23Z) - Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships. Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z) - Quantum Convolutional Neural Network: A Hybrid Quantum-Classical Approach for Iris Dataset Classification [0.0]
We present a hybrid quantum-classical machine learning model for classification tasks, integrating a 4-qubit quantum circuit with a classical neural network.
The model was trained for 20 epochs, reaching 100% accuracy on the Iris test set by epoch 16.
This work contributes to the growing body of research on hybrid quantum-classical models and their applicability to real-world datasets.
arXiv Detail & Related papers (2024-10-21T13:15:12Z) - Principled Architecture-aware Scaling of Hyperparameters [69.98414153320894]
Training a high-quality deep neural network requires choosing suitable hyperparameters, which is a non-trivial and expensive process.
In this work, we precisely characterize the dependence of initializations and maximal learning rates on the network architecture.
We demonstrate that network rankings in benchmarks can easily change when the networks are trained more effectively.
arXiv Detail & Related papers (2024-02-27T11:52:49Z) - Tensor Networks or Decision Diagrams? Guidelines for Classical Quantum Circuit Simulation [65.93830818469833]
Tensor networks and decision diagrams have been developed independently, with differing perspectives, terminologies, and backgrounds in mind.
We consider how these techniques approach classical quantum circuit simulation, and examine their (dis)similarities with regard to their most applicable abstraction level.
We provide guidelines for when tensor networks and when decision diagrams are better suited to classical quantum circuit simulation.
arXiv Detail & Related papers (2023-02-13T19:00:00Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Redefining Neural Architecture Search of Heterogeneous Multi-Network Models by Characterizing Variation Operators and Model Components [71.03032589756434]
We investigate the effect of different variation operators in a complex domain, that of multi-network heterogeneous neural models.
We characterize both the variation operators, according to their effect on the complexity and performance of the model, and the models themselves, relying on diverse metrics that estimate the quality of their different components.
arXiv Detail & Related papers (2021-06-16T17:12:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.