Resting-state fMRI Analysis using Quantum Time-series Transformer
- URL: http://arxiv.org/abs/2509.00711v1
- Date: Sun, 31 Aug 2025 06:08:57 GMT
- Title: Resting-state fMRI Analysis using Quantum Time-series Transformer
- Authors: Junghoon Justin Park, Jungwoo Seo, Sangyoon Bae, Samuel Yen-Chi Chen, Huan-Hsin Tseng, Jiook Cha, Shinjae Yoo
- Abstract summary: We introduce a novel quantum-enhanced transformer architecture leveraging Linear Combination of Unitaries and Quantum Singular Value Transformation. Unlike classical transformers, Quantum Time-series Transformer operates with polylogarithmic computational complexity, markedly reducing training overhead and enabling robust performance even with fewer parameters and limited sample sizes.
- Score: 27.94553142822623
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Resting-state functional magnetic resonance imaging (fMRI) has emerged as a pivotal tool for revealing intrinsic brain network connectivity and identifying neural biomarkers of neuropsychiatric conditions. However, classical self-attention transformer models--despite their formidable representational power--struggle with quadratic complexity, large parameter counts, and substantial data requirements. To address these barriers, we introduce a Quantum Time-series Transformer, a novel quantum-enhanced transformer architecture leveraging Linear Combination of Unitaries and Quantum Singular Value Transformation. Unlike classical transformers, Quantum Time-series Transformer operates with polylogarithmic computational complexity, markedly reducing training overhead and enabling robust performance even with fewer parameters and limited sample sizes. Empirical evaluation on the largest-scale fMRI datasets from the Adolescent Brain Cognitive Development Study and the UK Biobank demonstrates that Quantum Time-series Transformer achieves comparable or superior predictive performance compared to state-of-the-art classical transformer models, with especially pronounced gains in small-sample scenarios. Interpretability analyses using SHapley Additive exPlanations further reveal that Quantum Time-series Transformer reliably identifies clinically meaningful neural biomarkers of attention-deficit/hyperactivity disorder (ADHD). These findings underscore the promise of quantum-enhanced transformers in advancing computational neuroscience by more efficiently modeling complex spatio-temporal dynamics and improving clinical interpretability.
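The abstract gives no implementation details, but the Linear Combination of Unitaries (LCU) primitive it builds on is standard: a target operator A = sum_i alpha_i U_i is applied via PREPARE/SELECT circuits and ancilla post-selection. The NumPy sketch below simulates that post-selected behavior classically; the function names and the two-unitary example are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Linear Combination of Unitaries (LCU), simulated classically.
# Target: apply A = sum_i alpha_i * U_i to a state |psi>. On hardware this
# uses PREPARE (load sqrt(alpha_i) amplitudes on an ancilla register) and
# SELECT (apply U_i controlled on ancilla state |i>), then post-selects
# the ancilla on |0...0>.

def lcu_apply(alphas, unitaries, psi):
    """Return the post-selected state and the success probability.

    Success probability is ||A psi||^2 / (sum_i |alpha_i|)^2, the price
    paid for embedding a non-unitary A into a larger unitary circuit.
    """
    A = sum(a * U for a, U in zip(alphas, unitaries))
    out = A @ psi
    norm = np.linalg.norm(out)
    success_prob = norm**2 / np.sum(np.abs(alphas)) ** 2
    return out / norm, success_prob

# Illustrative one-qubit example: A = 0.6*X + 0.4*Z, a non-unitary mix.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
psi0 = np.array([1, 0], dtype=complex)  # |0>

state, prob = lcu_apply([0.6, 0.4], [X, Z], psi0)
print(state, prob)  # -> 0.4|0> + 0.6|1> (normalized), prob = 0.52
```

Quantum Singular Value Transformation generalizes this idea by applying a polynomial transformation to the singular values of a block-encoded operator; the same post-selection bookkeeping carries over.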
Related papers
- Towards Quantum Enhanced Adversarial Robustness with Rydberg Reservoir Learning [45.92935470813908]
Quantum reservoir computing (QRC) leverages the high-dimensional, nonlinear dynamics inherent in quantum many-body systems. Recent studies indicate that quantum models based on variational circuits remain susceptible to adversarial perturbations. We conduct the first systematic evaluation of adversarial robustness in a QRC-based learning model.
arXiv Detail & Related papers (2025-10-15T12:17:23Z) - Demonstration of Efficient Predictive Surrogates for Large-scale Quantum Processors [64.50565018996328]
We introduce the concept of predictive surrogates, designed to emulate the mean-value behavior of a given quantum processor with provable computational efficiency. We use these surrogates to emulate a quantum processor with up to 20 programmable superconducting qubits, enabling efficient pre-training of variational quantum eigensolvers. Experimental results reveal that the predictive surrogates not only reduce measurement overhead by orders of magnitude, but can also surpass the performance of conventional, quantum-resource-intensive approaches.
arXiv Detail & Related papers (2025-07-23T12:51:03Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [60.996803677584424]
Variational Quantum Circuits (VQCs) offer a novel pathway for quantum machine learning, but their practical application is hindered by inherent limitations such as constrained linear expressivity, optimization challenges, and acute sensitivity to quantum hardware noise. This work introduces VQC-MLPNet, a scalable and robust hybrid quantum-classical architecture designed to overcome these obstacles.
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Genetic Transformer-Assisted Quantum Neural Networks for Optimal Circuit Design [3.6953740776904924]
We introduce Genetic Transformer Assisted Quantum Neural Networks (GTQNNs), a hybrid learning framework that combines a transformer encoder with a shallow variational quantum circuit. Experiments on four benchmarks show that GTQNNs match or exceed state-of-the-art quantum models.
arXiv Detail & Related papers (2025-06-10T19:49:21Z) - Physics-inspired Energy Transition Neural Network for Sequence Learning [14.111325019623596]
In this study, we explore the capabilities of pure RNNs and reassess their long-term learning mechanisms. Inspired by physics energy-transition models that track energy changes over time, we propose an effective recurrent structure called the Physics-inspired Energy Transition Neural Network (PETNN). Our study presents an optimal foundational recurrent architecture and highlights the potential for developing effective recurrent neural networks in fields currently dominated by Transformers.
arXiv Detail & Related papers (2025-05-06T08:07:15Z) - Quantum Adaptive Self-Attention for Quantum Transformer Models [0.0]
We propose Quantum Adaptive Self-Attention (QASA), a novel hybrid architecture that enhances classical Transformer models with a quantum attention mechanism. QASA replaces dot-product attention with a parameterized quantum circuit (PQC) that adaptively captures inter-token relationships in the quantum Hilbert space. Experiments on synthetic time-series tasks demonstrate that QASA achieves faster convergence and superior generalization compared to both standard Transformers and reduced classical variants.
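The substitution this blurb describes (dot-product scores replaced by PQC outputs) is concrete enough to sketch. The following NumPy statevector simulation is a hypothetical two-qubit version of the idea, not the paper's circuit: each (query, key) pair is angle-encoded, entangled, and scored by the Z expectation on the first qubit.

```python
import numpy as np

# Hypothetical two-qubit sketch of a PQC attention score in the spirit of
# QASA (plain NumPy statevector math, NOT the paper's circuit): a (query,
# key) pair is angle-encoded, entangled with a CNOT, passed through one
# trainable rotation, and scored by <Z> on the first qubit.

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
Z0 = np.kron(np.diag([1.0, -1.0]).astype(complex), np.eye(2, dtype=complex))

def pqc_score(q, k, theta):
    """Similarity of scalar features q and k under trainable angle theta."""
    psi = np.zeros(4, dtype=complex)
    psi[0] = 1.0                                     # start in |00>
    psi = np.kron(ry(q), ry(k)) @ psi                # data encoding
    psi = CNOT @ psi                                 # entangling layer
    psi = np.kron(ry(theta), np.eye(2)) @ psi        # trainable layer
    return float(np.real(psi.conj() @ (Z0 @ psi)))  # <Z> on qubit 0

def quantum_attention(query, keys, values, theta=0.3):
    scores = np.array([pqc_score(query, k, theta) for k in keys])
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over keys
    return weights @ values

print(quantum_attention(0.4, np.array([0.1, 0.5, 0.9]), np.eye(3)))
```

In a trainable model, theta (and typically several layers of such angles) would be optimized by gradient descent alongside the classical parameters.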
arXiv Detail & Related papers (2025-04-05T02:52:37Z) - Learning with SASQuaTCh: a Novel Variational Quantum Transformer Architecture with Kernel-Based Self-Attention [0.464982780843177]
We present a variational quantum circuit architecture named Self-Attention Sequential Quantum Transformer Channel (SASQuaTCh). Our approach leverages recent insights from kernel-based operator learning in the context of vision transformer networks, using simple gate operations and a set of multi-dimensional quantum Fourier transforms. To validate our approach, we consider image classification tasks in simulation and with hardware, where with only 9 qubits and a handful of parameters we are able to simultaneously embed and classify a grayscale image of handwritten digits with high accuracy.
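The quantum Fourier transform at the heart of this kernel-attention construction is standard and easy to check numerically: on n qubits it is the N-by-N unitary F[j,k] = exp(2*pi*i*j*k/N)/sqrt(N) with N = 2^n. A small NumPy verification (illustrative only; hardware implementations decompose it into gates rather than building an explicit matrix):

```python
import numpy as np

# The n-qubit QFT is the N x N unitary F[j, k] = exp(2*pi*i*j*k/N)/sqrt(N)
# with N = 2**n. An explicit matrix is fine for small N; on hardware it
# decomposes into O(n^2) Hadamard and controlled-phase gates.

def qft_matrix(n_qubits):
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)                              # 8x8 QFT on 3 qubits
assert np.allclose(F.conj().T @ F, np.eye(8))  # unitarity check
```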
arXiv Detail & Related papers (2024-03-21T18:00:04Z) - Ab-initio variational wave functions for the time-dependent many-electron Schrödinger equation [41.94295877935867]
We introduce a variational approach for fermionic time-dependent wave functions, surpassing mean-field approximations.
We use time-dependent Jastrow factors and backflow transformations, which are enhanced through neural-network parameterizations.
The results showcase the ability of our variational approach to accurately capture the time evolution, providing insight into the quantum dynamics of interacting electronic systems.
arXiv Detail & Related papers (2024-03-12T09:37:22Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match a cortical neuron's input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Conditional Generative Models for Simulation of EMG During Naturalistic Movements [45.698312905115955]
We present a conditional generative neural network trained adversarially to generate motor unit action potential waveforms.
We demonstrate the ability of such a model to predictively interpolate between a much smaller number of numerical-model outputs with high accuracy.
arXiv Detail & Related papers (2022-11-03T14:49:02Z) - Variational learning of quantum ground states on spiking neuromorphic hardware [0.0]
High-dimensional sampling spaces and transient autocorrelations confront neural networks with a challenging computational bottleneck.
Compared to conventional neural networks, physical-model devices offer a fast, efficient and inherently parallel substrate.
We demonstrate the ability of a neuromorphic chip to represent the ground states of quantum spin models by variational energy minimization.
arXiv Detail & Related papers (2021-09-30T14:39:45Z) - Post-Training Quantization for Vision Transformer [85.57953732941101]
We present an effective post-training quantization algorithm for reducing the memory storage and computational costs of vision transformers.
We can obtain an 81.29% top-1 accuracy using the DeiT-B model on ImageNet with roughly 8-bit quantization.
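For reference, the core post-training step being applied here is the familiar float-to-int8 mapping with a calibrated scale. The sketch below shows a generic symmetric per-tensor scheme in NumPy; the paper's calibration procedure is more sophisticated, so treat this as background, not their method.

```python
import numpy as np

# Generic symmetric per-tensor int8 quantization: map float weights to
# integers in [-127, 127] with a scale set by the observed weight range.
# Post-training quantization calibrates such scales without retraining.

def quantize_int8(w):
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
max_err = np.abs(w - dequantize(q, s)).max()   # bounded by scale / 2
print(q.dtype, s, max_err)
```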
arXiv Detail & Related papers (2021-06-27T06:27:22Z) - Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)