Physics-informed time series analysis with Kolmogorov-Arnold Networks under Ehrenfest constraints
- URL: http://arxiv.org/abs/2509.18483v1
- Date: Tue, 23 Sep 2025 00:37:04 GMT
- Title: Physics-informed time series analysis with Kolmogorov-Arnold Networks under Ehrenfest constraints
- Authors: Abhijit Sen, Illya V. Lukin, Kurt Jacobs, Lev Kaplan, Andrii G. Sotnikov, Denys I. Bondar
- Abstract summary: Predictions of quantum dynamical responses lie at the heart of modern physics. Quantum dynamics presents a fundamentally different challenge: forecasting the entire temporal evolution of quantum systems. We introduce a fundamentally new approach: Kolmogorov-Arnold Networks (KANs) augmented with physics-informed loss functions that enforce the Ehrenfest theorems.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prediction of quantum dynamical responses lies at the heart of modern physics. Yet, modeling these time-dependent behaviors remains a formidable challenge because quantum systems evolve in high-dimensional Hilbert spaces, often rendering traditional numerical methods computationally prohibitive. While large language models have achieved remarkable success in sequential prediction, quantum dynamics presents a fundamentally different challenge: forecasting the entire temporal evolution of quantum systems rather than merely the next element in a sequence. Existing neural architectures such as recurrent and convolutional networks often require vast training datasets and suffer from spurious oscillations that compromise physical interpretability. In this work, we introduce a fundamentally new approach: Kolmogorov-Arnold Networks (KANs) augmented with physics-informed loss functions that enforce the Ehrenfest theorems. Our method achieves superior accuracy with significantly less training data: it requires only 5.4 percent of the samples (200) compared to Temporal Convolutional Networks (3,700). We further introduce the Chain of KANs, a novel architecture that embeds temporal causality directly into the model design, making it particularly well-suited for time series modeling. Our results demonstrate that physics-informed KANs offer a compelling advantage over conventional black-box models, maintaining both mathematical rigor and physical consistency while dramatically reducing data requirements.
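The Ehrenfest-constrained loss described in the abstract can be sketched as a penalty term. The following is a minimal numpy sketch, not taken from the paper: it assumes the network outputs trajectories of the expectation values <x>(t), <p>(t), and <dV/dx>(t), and penalizes finite-difference violations of the Ehrenfest relations d<x>/dt = <p>/m and d<p>/dt = -<dV/dx>; the function name and discretization are illustrative.

```python
import numpy as np

def ehrenfest_loss(t, x_mean, p_mean, grad_v_mean, m=1.0):
    """Mean-squared violation of the Ehrenfest relations
        d<x>/dt = <p>/m   and   d<p>/dt = -<dV/dx>,
    with time derivatives estimated by central finite differences."""
    r1 = np.gradient(x_mean, t) - p_mean / m
    r2 = np.gradient(p_mean, t) + grad_v_mean
    return np.mean(r1**2) + np.mean(r2**2)
```

For a harmonic oscillator (V = x^2/2, so <x> = cos t, <p> = -sin t, <dV/dx> = <x>) the loss vanishes up to discretization error, while an unphysical trajectory incurs a large penalty; added to a data-fit term, it steers training toward physically consistent dynamics.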
Related papers
- AI-enhanced Quantum Simulation of Schwinger Model [0.0]
The Schwinger Model from Quantum Electrodynamics (QED) has long served as a valuable simplified model for exploring key physical phenomena. Here, we propose a model that we refer to as the Neural Network Facilitated Implicit Quantum Simulation (NN-IQS) model as a solution.
arXiv Detail & Related papers (2025-09-24T14:37:16Z) - Quantum-Enhanced Channel Mixing in RWKV Models for Time Series Forecasting [0.0]
Recent advancements in neural sequence modeling have led to architectures such as RWKV, which combine recurrent-style time mixing with feedforward channel mixing to enable efficient long-context processing. In this work, we propose QuantumRWKV, a hybrid quantum-classical extension of the RWKV model, where the standard feedforward network (FFN) is partially replaced by a variational quantum circuit (VQC). The quantum component is designed to enhance nonlinear representational capacity while preserving end-to-end differentiability via the PennyLane framework.
arXiv Detail & Related papers (2025-05-18T02:19:30Z) - PINP: Physics-Informed Neural Predictor with latent estimation of fluid flows [11.102585080028945]
We propose a new physics-informed learning approach that incorporates coupled physical quantities into the prediction process. By incorporating physical equations, our model demonstrates temporal extrapolation and spatial generalization capabilities.
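Incorporating physical equations into a predictor typically means penalizing the residual of a governing PDE on the predicted fields. Below is a hypothetical 1-D illustration (not PINP's actual formulation): a finite-difference residual of the continuity equation d(rho)/dt + d(rho*u)/dx = 0, evaluated on a space-time grid of predicted density and velocity.

```python
import numpy as np

def continuity_residual(rho, u, dt, dx):
    """Finite-difference residual of the 1-D continuity equation
        d(rho)/dt + d(rho * u)/dx = 0
    for fields of shape (T, X); near-zero residual means the
    prediction conserves mass on the grid."""
    drho_dt = np.gradient(rho, dt, axis=0)
    dflux_dx = np.gradient(rho * u, dx, axis=1)
    return drho_dt + dflux_dx
```

A traveling wave rho(x, t) = sin(x - c t) advected at constant velocity u = c satisfies the equation exactly, so its interior residual is small (only discretization error); the mean-squared residual can then serve as an extra loss term.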
arXiv Detail & Related papers (2025-04-08T14:11:01Z) - Many-body dynamics with explicitly time-dependent neural quantum states [0.0]
We introduce the time-dependent neural quantum state (t-NQS). We optimize a single, time-independent set of parameters to solve the time-dependent Schrödinger equation across an entire time interval. These results establish t-NQS as a powerful framework for exploring quantum dynamics in strongly correlated systems.
arXiv Detail & Related papers (2024-12-16T14:53:26Z) - QIANets: Quantum-Integrated Adaptive Networks for Reduced Latency and Improved Inference Times in CNN Models [2.6663666678221376]
Convolutional neural networks (CNNs) have made significant advances in computer vision tasks, yet their high inference times and latency limit real-world applicability.
We introduce QIANets: a novel approach of redesigning the traditional GoogLeNet, DenseNet, and ResNet-18 model architectures to process more parameters and computations whilst maintaining low inference times.
Despite experimental limitations, the method was tested and evaluated, demonstrating reductions in inference times along with effective preservation of accuracy.
arXiv Detail & Related papers (2024-10-14T09:24:48Z) - Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted version of the MLP-Mixer for STTD forecast at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Dynamics with autoregressive neural quantum states: application to critical quench dynamics [41.94295877935867]
We present an alternative general scheme that enables one to capture long-time dynamics of quantum systems in a stable fashion.
We apply the scheme to time-dependent quench dynamics by investigating the Kibble-Zurek mechanism in the two-dimensional quantum Ising model.
arXiv Detail & Related papers (2022-09-07T15:50:00Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the temporal evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
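A DCT spatial encoding of the kind described amounts to projecting a field onto cosine basis functions and reconstructing it from the (possibly truncated) coefficients. The helper names below are illustrative; production code would use an FFT-based routine such as scipy.fft.dct, but the direct DCT-II and its DCT-III inverse are short enough to write out.

```python
import numpy as np

def dct2(x):
    """DCT-II of a 1-D signal: X_k = sum_n x_n cos(pi/N * k * (n + 1/2))."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    return np.cos(np.pi / N * k * (n + 0.5)) @ x

def idct2(X):
    """Inverse of dct2 (a scaled DCT-III):
    x_n = (2/N) * (X_0/2 + sum_{k>=1} X_k cos(pi/N * k * (n + 1/2)))."""
    N = len(X)
    n = np.arange(N)
    k = np.arange(1, N).reshape(-1, 1)
    return (2.0 / N) * (X[0] / 2.0 + X[1:] @ np.cos(np.pi / N * k * (n + 0.5)))
```

The round trip idct2(dct2(x)) recovers x exactly; dropping high-index coefficients before inverting gives the compressed spatial representation that the recurrent network would then evolve in time.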
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
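Solving a neural ODE means numerically integrating a learned vector field dx/dt = f(x, t). As a minimal stand-in for the adaptive solvers such models typically use, a fixed-step Euler integrator looks like the following (names are illustrative, not from MTGODE):

```python
import numpy as np

def euler_odeint(f, x0, t):
    """Integrate dx/dt = f(x, t) with the explicit Euler method
    over the time grid t, starting from x0; in a neural ODE,
    f would be a trained network rather than a fixed function."""
    xs = [x0]
    for t0, t1 in zip(t[:-1], t[1:]):
        xs.append(xs[-1] + (t1 - t0) * f(xs[-1], t0))
    return np.array(xs)
```

For exponential decay f(x, t) = -x with x0 = 1, the trajectory approaches exp(-t) as the step size shrinks; gradients can then flow through the unrolled steps (or an adjoint method) to train f.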
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Quantum algorithms for quantum dynamics: A performance study on the spin-boson model [68.8204255655161]
Quantum algorithms for quantum dynamics simulations are traditionally based on implementing a Trotter-approximation of the time-evolution operator.
Variational quantum algorithms have become an indispensable alternative, enabling small-scale simulations on present-day hardware.
We show that, despite providing a clear reduction of quantum gate cost, the variational method in its current implementation is unlikely to lead to a quantum advantage.
arXiv Detail & Related papers (2021-08-09T18:00:05Z) - ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations [86.41674945012369]
We develop a scalable and expressive Graph Neural Network model, ForceNet, to approximate atomic forces.
Our proposed ForceNet is able to predict atomic forces more accurately than state-of-the-art physics-based GNNs.
arXiv Detail & Related papers (2021-03-02T03:09:06Z) - Quantum Generative Adversarial Networks in a Continuous-Variable Architecture to Simulate High Energy Physics Detectors [0.0]
We introduce and analyze a new prototype of quantum GAN (qGAN) employed in continuous-variable quantum computing.
Two CV qGAN models with a quantum and a classical discriminator have been tested to reproduce calorimeter outputs in a reduced size.
arXiv Detail & Related papers (2021-01-26T23:33:14Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.