Fourier Fingerprints of Ansatzes in Quantum Machine Learning
- URL: http://arxiv.org/abs/2508.20868v1
- Date: Thu, 28 Aug 2025 15:00:37 GMT
- Title: Fourier Fingerprints of Ansatzes in Quantum Machine Learning
- Authors: Melvin Strobl, M. Emre Sahin, Lucas van der Horst, Eileen Kuehn, Achim Streit, Ben Jaderberg
- Abstract summary: We show the existence of correlations between the Fourier modes, which depend on the structure of the circuit. For several popular ansatzes, we compute the Fourier coefficient correlations (FCCs) and construct the Fourier fingerprint. We show how, for the problem of learning random Fourier series, the FCC correctly predicts the relative performance of ansatzes whilst the widely-used expressibility metric does not.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Typical schemes to encode classical data in variational quantum machine learning (QML) lead to quantum Fourier models with $\mathcal{O}(\exp(n))$ Fourier basis functions in the number of qubits. Despite this, in order for the model to be efficiently trainable, the number of parameters must scale as $\mathcal{O}(\mathrm{poly}(n))$. This imbalance implies the existence of correlations between the Fourier modes, which depend on the structure of the circuit. In this work, we demonstrate that this phenomenon exists and show cases where these correlations can be used to predict ansatz performance. For several popular ansatzes, we numerically compute the Fourier coefficient correlations (FCCs) and construct the Fourier fingerprint, a visual representation of the correlation structure. We subsequently show how, for the problem of learning random Fourier series, the FCC correctly predicts relative performance of ansatzes whilst the widely-used expressibility metric does not. Finally, we demonstrate how our framework applies to the more challenging problem of jet reconstruction in high-energy physics. Overall, our results demonstrate how the Fourier fingerprint is a powerful new tool in the problem of optimal ansatz choice for QML.
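The abstract's core observation can be sketched numerically. The following is a minimal, hypothetical illustration (not the authors' code): a single-qubit data-reuploading circuit whose output is a Fourier series with integer frequencies in $\{-L, \ldots, L\}$, from which coefficient-magnitude correlations across random parameter draws are computed in the spirit of the paper's FCCs. All circuit and variable choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(a):
    c, s = np.cos(a / 2), np.sin(a / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(a):
    return np.diag([np.exp(-1j * a / 2), np.exp(1j * a / 2)])

Z = np.diag([1.0, -1.0]).astype(complex)
L = 3  # number of data re-uploads -> frequencies in {-L, ..., L}

def model(x, thetas):
    """f(x) = <psi(x, thetas)| Z |psi(x, thetas)> for a toy ansatz."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for layer in range(L):
        psi = ry(thetas[layer]) @ psi  # trainable layer
        psi = rz(x) @ psi              # data-encoding gate
    psi = ry(thetas[L]) @ psi          # final trainable layer
    return np.real(psi.conj() @ (Z @ psi))

# Recover the Fourier coefficients c_k by sampling f on a uniform grid
# over [0, 2*pi) and applying the discrete Fourier transform.
K = 2 * L + 1
xs = 2 * np.pi * np.arange(K) / K

def fourier_coeffs(thetas):
    fx = np.array([model(x, thetas) for x in xs])
    return np.fft.fft(fx) / K  # index k holds c_k (negative freqs wrap)

# Correlate coefficient magnitudes over random parameter draws.
samples = np.array([np.abs(fourier_coeffs(rng.uniform(0, 2 * np.pi, L + 1)))
                    for _ in range(200)])
fcc = np.corrcoef(samples.T)  # (2L+1) x (2L+1) correlation matrix
print(fcc.shape)
```

Because the circuit has only $L + 1$ parameters while the model spans $2L + 1$ Fourier modes, the off-diagonal structure of this matrix is generically nontrivial, which is the imbalance the abstract describes.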
Related papers
- FFT-Accelerated Auxiliary Variable MCMC for Fermionic Lattice Models: A Determinant-Free Approach with $O(N\log N)$ Complexity [52.3171766248012]
We introduce a Markov Chain Monte Carlo (MCMC) algorithm that dramatically accelerates the simulation of quantum many-body systems. We validate our algorithm on benchmark quantum physics problems, accurately reproducing known theoretical results. Our work provides a powerful tool for large-scale probabilistic inference and opens avenues for physics-inspired generative models.
arXiv Detail & Related papers (2025-10-13T07:57:21Z) - Fourier Learning Machines: Nonharmonic Fourier-Based Neural Networks for Scientific Machine Learning [0.5097809301149341]
We introduce the Fourier Learning Machine (FLM), a neural network (NN) architecture designed to represent a multidimensional nonharmonic Fourier series. The FLM uses a simple feedforward structure with cosine activation functions to learn the frequencies, amplitudes, and phase shifts of the series as trainable parameters. Computational experiments show that the performance of FLMs is comparable, and often superior, to that of established architectures like SIREN and vanilla feedforward NNs.
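The FLM idea described above can be sketched as a one-layer network of cosine units whose frequencies, amplitudes, and phases are all trainable. This is a hypothetical toy version fitted by plain gradient descent; the unit count, learning rate, and target function are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 200)
target = 1.5 * np.cos(2.3 * x + 0.4)    # nonharmonic (non-integer) frequency

J = 4                                   # number of cosine units
a = rng.normal(0, 0.1, J)               # amplitudes
w = rng.uniform(0.5, 4.0, J)            # frequencies
p = rng.normal(0, 0.1, J)               # phase shifts

lr = 0.05
for _ in range(3000):
    phase = np.outer(x, w) + p          # shape (200, J)
    pred = (a * np.cos(phase)).sum(axis=1)
    err = pred - target
    # Gradients of the mean-squared error w.r.t. each parameter set.
    ga = 2 * (err[:, None] * np.cos(phase)).mean(axis=0)
    common = -2 * err[:, None] * a * np.sin(phase)
    gw = (common * x[:, None]).mean(axis=0)
    gp = common.mean(axis=0)
    a -= lr * ga
    w -= lr * gw
    p -= lr * gp

mse = ((pred - target) ** 2).mean()
print(mse)
```

The key distinction from a harmonic Fourier fit is that the frequencies `w` move under gradient descent rather than being fixed to integer multiples of a base frequency.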
arXiv Detail & Related papers (2025-09-10T16:49:20Z) - Spectral Bias in Variational Quantum Machine Learning [0.0]
We study this effect specifically in parameterised quantum circuits (PQCs). We find that the magnitude of the Fourier coefficients' gradients during training strongly correlates with the coefficients' redundancy. We also demonstrate that PQCs with greater redundancy exhibit increased robustness to random perturbations in their parameters at the corresponding frequencies.
arXiv Detail & Related papers (2025-06-27T18:11:54Z) - Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Fourier Series Guided Design of Quantum Convolutional Neural Networks for Enhanced Time Series Forecasting [0.9060149007362646]
We apply 1D quantum convolution to address the task of time series forecasting. By encoding multiple points into the quantum circuit to predict subsequent data, each point becomes a feature, transforming the problem into a multidimensional one.
arXiv Detail & Related papers (2024-04-23T01:35:07Z) - Constrained and Vanishing Expressivity of Quantum Fourier Models [2.7746258981078196]
We show a new correlation between the Fourier coefficients of the quantum model and its encoding gates.
We also show a phenomenon of vanishing expressivity in certain settings.
These two behaviors imply novel forms of constraints which limit the expressivity of PQCs.
arXiv Detail & Related papers (2024-03-14T14:05:24Z) - Fourier Continuation for Exact Derivative Computation in
Physics-Informed Neural Operators [53.087564562565774]
The Physics-Informed Neural Operator (PINO) is a machine learning architecture that has shown promising empirical results for learning partial differential equations.
We present an architecture that leverages Fourier continuation (FC) to apply the exact gradient method to PINO for nonperiodic problems.
arXiv Detail & Related papers (2022-11-29T06:37:54Z) - Deep Fourier Up-Sampling [100.59885545206744]
Up-sampling in the Fourier domain is more challenging as it does not follow such a local property.
We propose a theoretically sound Deep Fourier Up-Sampling (FourierUp) to solve these issues.
arXiv Detail & Related papers (2022-10-11T06:17:31Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - Adaptive Fourier Neural Operators: Efficient Token Mixers for Transformers [55.90468016961356]
We propose an efficient token mixer that learns to mix in the Fourier domain.
AFNO is based on a principled foundation of operator learning.
It can handle a sequence size of 65k and outperforms other efficient self-attention mechanisms.
arXiv Detail & Related papers (2021-11-24T05:44:31Z) - Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases [73.53227696624306]
We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
arXiv Detail & Related papers (2020-10-01T14:31:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.