AI-enhanced Quantum Simulation of Schwinger Model
- URL: http://arxiv.org/abs/2509.20173v1
- Date: Wed, 24 Sep 2025 14:37:16 GMT
- Title: AI-enhanced Quantum Simulation of Schwinger Model
- Authors: Ao-Ning Wang, Min-Quan He, Z. D. Wang
- Abstract summary: The Schwinger Model from Quantum Electrodynamics (QED) has long served as a valuable simplified model for exploring key physical phenomena. Here, we propose a model that we refer to as the Neural Network Facilitated Implicit Quantum Simulation (NN-IQS) model as a solution.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Schwinger Model from Quantum Electrodynamics (QED) has long served as a valuable simplified model for exploring key physical phenomena in Quantum Chromodynamics (QCD) - a field rich in fundamental insights but substantially more complex. While the phase diagram of the Schwinger Model bears extraordinary significance and remains challenging to investigate, recent progress on the model mainly focuses on detailed case studies. Here, we propose a model that we refer to as the Neural Network Facilitated Implicit Quantum Simulation (NN-IQS) model as a solution. After training on limited discrete data points on the Schwinger Model phase diagram, the NN-IQS model allows quick generation of extra sample points over a continuous domain. The model can even generalize beyond its training range, maintaining robust performance in previously unexplored parameter space and system sizes.
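The core idea of the abstract - fit a network to sparse, discrete phase-diagram samples, then query it over a continuous parameter domain - can be illustrated with a minimal sketch. This is not the paper's NN-IQS architecture; the toy data, network size, and training loop below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for sparse phase-diagram data: a few discrete
# (coupling parameter, order parameter) samples from a smooth curve.
x_train = np.linspace(-1.0, 1.0, 8).reshape(-1, 1)
y_train = np.tanh(3.0 * x_train)  # toy "phase diagram" observable

# Minimal one-hidden-layer MLP trained by full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(x_train @ W1 + b1)          # hidden activations
    y_hat = h @ W2 + b2                     # network prediction
    err = y_hat - y_train
    # Backpropagate the mean-squared-error loss.
    gW2 = h.T @ err / len(x_train); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = x_train.T @ dh / len(x_train); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# After training on 8 discrete points, the network can be queried on a
# continuous grid -- the "quick generation of extra sample points".
x_dense = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y_dense = np.tanh(x_dense @ W1 + b1) @ W2 + b2
```

The generalization claim in the abstract (robustness beyond the training range and at new system sizes) is a property of the trained NN-IQS model itself and is not captured by this interpolation toy.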
Related papers
- QiNN-QJ: A Quantum-inspired Neural Network with Quantum Jump for Multimodal Sentiment Analysis [11.46663985298648]
We propose a Quantum-inspired Neural Network with Quantum Jump (QiNN-QJ) for multimodal entanglement modelling. By jointly learning Hamiltonian and Lindblad operators, QiNN-QJ generates controllable cross-modal entanglement. This work establishes a principled framework for entangled multimodal fusion and paves the way for quantum-inspired approaches in modelling complex cross-modal correlations.
arXiv Detail & Related papers (2025-10-31T01:25:55Z)
- Physics-informed time series analysis with Kolmogorov-Arnold Networks under Ehrenfest constraints [0.0]
Predictions of quantum dynamical responses lie at the heart of modern physics. Quantum dynamics presents a fundamentally different challenge: forecasting the entire temporal evolution of quantum systems. We introduce a fundamentally new approach: Kolmogorov-Arnold Networks (KANs) augmented with physics-informed loss functions that enforce the Ehrenfest theorems.
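The "physics-informed loss enforcing the Ehrenfest theorems" idea can be sketched generically: a data-fit term plus a penalty on violations of an Ehrenfest-style relation such as d⟨x⟩/dt = ⟨p⟩/m. This is not the paper's KAN model; the function names, harmonic-oscillator data, and penalty weight below are illustrative assumptions.

```python
import numpy as np

def physics_informed_loss(x_pred, p_pred, x_data, dt, m=1.0, lam=1.0):
    """Data-fit MSE plus an Ehrenfest-theorem penalty (illustrative)."""
    data_term = np.mean((x_pred - x_data) ** 2)
    dxdt = np.gradient(x_pred, dt)       # finite-difference d<x>/dt
    residual = dxdt - p_pred / m         # violation of d<x>/dt = <p>/m
    return data_term + lam * np.mean(residual ** 2)

t = np.linspace(0.0, 2.0 * np.pi, 200)
dt = t[1] - t[0]
x_true = np.cos(t)    # <x>(t) for a unit harmonic oscillator
p_true = -np.sin(t)   # <p>(t), consistent with the Ehrenfest relation

consistent = physics_informed_loss(x_true, p_true, x_true, dt)
violated = physics_informed_loss(x_true, np.zeros_like(t), x_true, dt)
# The physics penalty separates the two candidates even though their
# data-fit terms (agreement with the x trajectory) are identical.
```

The design point is that the constraint acts as a regularizer: trajectories that fit the data but break the conservation-law structure are penalized, which is what stabilizes long-horizon forecasts.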
arXiv Detail & Related papers (2025-09-23T00:37:04Z)
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors - such as inertia, damping, a learned potential function, and forces - to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- Solving the Hubbard model with Neural Quantum States [66.55653324211542]
We present state-of-the-art results for the doped two-dimensional (2D) Hubbard model. We find that different attention heads in the NQS ansatz can directly encode correlations at different scales. Our work establishes NQS as a powerful tool for solving challenging many-fermion systems.
arXiv Detail & Related papers (2025-07-03T14:08:25Z)
- Are Foundational Atomistic Models Reliable for Finite-Temperature Molecular Dynamics? [5.017458218949553]
Machine learning force fields have emerged as promising tools for molecular dynamics (MD) simulations. This Perspective adopts a practitioner's viewpoint to ask a critical question: Are these foundational atomistic models reliable for one of their most compelling applications?
arXiv Detail & Related papers (2025-03-11T09:23:01Z) - A Quantum Neural Network Transfer-Learning Model for Forecasting Problems with Continuous and Discrete Variables [0.0]
This study introduces simple yet effective continuous- and discrete-variable quantum neural network (QNN) models as a transfer-learning approach for forecasting tasks. The CV-QNN features a single quantum layer with two qubits to establish entanglement and utilizes a minimal set of quantum gates. The model's frozen parameters are successfully applied to various forecasting tasks, including energy consumption, traffic flow, weather conditions, and cryptocurrency price prediction.
arXiv Detail & Related papers (2025-03-04T22:38:51Z) - Recurrent convolutional neural networks for modeling non-adiabatic dynamics of quantum-classical systems [1.23088383881821]
We present an RNN model based on convolutional neural networks for modeling the non-adiabatic dynamics of hybrid quantum-classical systems. We demonstrate that the PARC-CNN architecture can effectively learn the statistical climate of the Holstein model under deep-quench conditions.
arXiv Detail & Related papers (2024-12-09T16:23:25Z) - Classical Benchmarks for Variational Quantum Eigensolver Simulations of the Hubbard Model [1.1017516493649393]
We show that the error in its ground state energy and wavefunction plateaus for larger lattices, while stronger electronic correlations magnify this issue. Our study highlights the capabilities and limitations of current approaches for solving the Hubbard model on quantum hardware.
arXiv Detail & Related papers (2024-08-01T18:00:04Z) - Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted version of the computationally efficient MLP-Mixer for STTD forecast at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
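The pattern described here - a differentiable forward model combined with automatic differentiation to recover unknown parameters from data - can be sketched with a toy example. This is not the paper's framework; the dispersion-style forward model and all names below are illustrative assumptions, and the gradient is written analytically where an autodiff framework would compute it.

```python
import numpy as np

def forward_model(q, J):
    """Toy differentiable forward model: a one-parameter dispersion curve."""
    return J * np.abs(np.sin(q / 2.0))

q = np.linspace(0.1, np.pi, 50)
J_true = 2.5
data = forward_model(q, J_true)   # stand-in for measured scattering data

# Recover J by gradient descent on the mean-squared-error fit loss.
J = 0.5                           # initial guess for the unknown parameter
lr = 0.1
for _ in range(200):
    resid = forward_model(q, J) - data
    # d/dJ of mean(resid^2); autodiff would yield the same derivative.
    grad = 2.0 * np.mean(resid * np.abs(np.sin(q / 2.0)))
    J -= lr * grad
```

Because the forward model is differentiable, the same fitted model can be reused against new data in real time, which is the "build and train once" point made above.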
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Dynamics with autoregressive neural quantum states: application to critical quench dynamics [41.94295877935867]
We present an alternative general scheme that enables one to capture long-time dynamics of quantum systems in a stable fashion.
We apply the scheme to time-dependent quench dynamics by investigating the Kibble-Zurek mechanism in the two-dimensional quantum Ising model.
arXiv Detail & Related papers (2022-09-07T15:50:00Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.