Ensemble Reservoir Computing for Dynamical Systems: Prediction of
Phase-Space Stable Region for Hadron Storage Rings
- URL: http://arxiv.org/abs/2301.06786v1
- Date: Tue, 17 Jan 2023 10:29:07 GMT
- Authors: Maxime Casanova, Barbara Dalena, Luca Bonaventura, Massimo Giovannozzi
- Abstract summary: Echo State Networks (ESN) are a class of recurrent neural networks that are computationally effective.
We present the performance reached by an ensemble of ESNs in predicting the phase-space stability region.
We observe that the proposed ESN approach is capable of effectively predicting the time evolution of the extent of the dynamic aperture.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate the ability of an ensemble reservoir computing approach to
predict the long-term behaviour of the phase-space region in which the motion
of charged particles in hadron storage rings is bounded, the so-called dynamic
aperture. Currently, the calculation of the phase-space stability region of
hadron storage rings is performed through direct computer simulations, which
are resource- and time-intensive processes. Echo State Networks (ESN) are a
class of recurrent neural networks that are computationally effective, since
they avoid backpropagation and require only cross-validation. Furthermore, they
have been proven to be universal approximants of dynamical systems. In this
paper, we present the performance reached by ESN based on an ensemble approach
for the prediction of the phase-space stability region and compare it with
analytical scaling laws based on the stability-time estimate of the Nekhoroshev
theorem for Hamiltonian systems. We observe that the proposed ESN approach is
capable of effectively predicting the time evolution of the extent of the
dynamic aperture, improving on the predictions of analytical scaling laws, thus
providing an efficient surrogate model.
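To make the approach concrete, the following is a minimal, self-contained sketch of an ensemble of echo state networks in NumPy. It is illustrative only: the reservoir size, scalings, and the toy logarithmic-decay series (the qualitative shape Nekhoroshev-type scaling laws predict for the dynamic aperture) stand in for the authors' actual architecture and tracking data.

```python
import numpy as np

rng = np.random.default_rng(0)

class ESN:
    """Minimal echo state network: fixed random reservoir, ridge-trained readout."""

    def __init__(self, n_in, n_res=200, rho=0.9, ridge=1e-6):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale to spectral radius rho (common echo-state-property heuristic)
        W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
        self.W, self.ridge = W, ridge

    def _states(self, U):
        x, X = np.zeros(self.W.shape[0]), []
        for u in U:
            x = np.tanh(self.W @ x + self.W_in @ u)
            X.append(x.copy())
        return np.array(X)

    def fit(self, U, Y, washout=20):
        # Only the linear readout is trained -- no backpropagation through time
        X = self._states(U)[washout:]
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.W_out = np.linalg.solve(A, X.T @ Y[washout:])

    def predict(self, U):
        return self._states(U) @ self.W_out

# Toy target: a slow logarithmic decay, mimicking dynamic-aperture shrinkage
t = np.linspace(0.0, 10.0, 400)
series = (1.0 + 2.0 / np.log(t + 2.0)).reshape(-1, 1)
U, Y = series[:-1], series[1:]          # one-step-ahead prediction task

# Ensemble: average independently initialised reservoirs to reduce the
# variance coming from the random draws of W and W_in
ensemble = [ESN(n_in=1) for _ in range(5)]
for esn in ensemble:
    esn.fit(U, Y)
pred = np.mean([esn.predict(U) for esn in ensemble], axis=0)
```

Averaging over independently initialised reservoirs is what makes the approach an ensemble: each member is cheap to train, and the mean prediction is less sensitive to any single random reservoir realization.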
Related papers
- Entropy stable conservative flux form neural networks
We propose an entropy-stable conservative flux form neural network (CFN) that integrates classical numerical conservation laws into a data-driven framework.
Numerical experiments demonstrate that the entropy-stable CFN achieves both stability and conservation while maintaining accuracy over extended time domains.
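The conservative flux form referred to above is a classical finite-volume construction: whatever interface flux is used (learned or hand-written), total mass is conserved exactly because interface fluxes telescope. This sketch uses a Lax-Friedrichs flux as a stand-in for the paper's trained flux network.

```python
import numpy as np

def conservative_step(u, flux, dt, dx):
    """One flux-form update: u_i^{n+1} = u_i^n - dt/dx * (F_{i+1/2} - F_{i-1/2}).
    On periodic boundaries sum(u) is conserved for ANY flux function, since
    each interface flux appears once with + and once with - sign."""
    ul = u
    ur = np.roll(u, -1)              # right neighbour (periodic)
    F = flux(ul, ur)                 # interface flux F_{i+1/2}
    return u - dt / dx * (F - np.roll(F, 1))

# Stand-in "learned" flux: Lax-Friedrichs for linear advection u_t + u_x = 0;
# in the paper this map would be a neural network trained from data.
def lf_flux(ul, ur, a=1.0):
    return 0.5 * a * (ul + ur) - 0.5 * abs(a) * (ur - ul)

x = np.linspace(0.0, 1.0, 128, endpoint=False)
u = np.exp(-((x - 0.5) ** 2) / 0.01)     # Gaussian pulse
mass0 = u.sum()
for _ in range(100):
    u = conservative_step(u, lf_flux, dt=0.004, dx=1 / 128)
```

After 100 steps the pulse has advected (and diffused, as Lax-Friedrichs does), but the discrete mass `u.sum()` matches `mass0` to round-off, which is the structural property the CFN builds into a data-driven model.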
arXiv Detail & Related papers (2024-11-04T02:01:31Z)
- Dynamical system prediction from sparse observations using deep neural networks with Voronoi tessellation and physics constraint
We introduce the Dynamic System Prediction from Sparse Observations using Voronoi Tessellation (DSOVT) framework.
By integrating Voronoi tessellations with deep learning models, DSOVT is adept at predicting dynamical systems with sparse, unstructured observations.
Compared to purely data-driven models, our physics-based approach enables the model to learn physical laws within explicitly formulated dynamics.
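A common way to realize the Voronoi-tessellation step is to rasterize sparse, unstructured sensors onto a regular grid by nearest-sensor assignment, so each Voronoi cell carries its generator's observation. This hypothetical NumPy sketch shows that preprocessing step only, not the DSOVT models themselves.

```python
import numpy as np

def voronoi_field(sensor_xy, sensor_vals, nx=64, ny=64):
    """Fill every grid pixel with the value of its nearest sensor, i.e.
    paint each Voronoi cell with its generator's observation, turning
    scattered measurements into a dense image a CNN can consume."""
    gx, gy = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    # Squared distance from every pixel to every sensor; argmin = Voronoi cell
    d2 = ((grid[:, None, :] - sensor_xy[None, :, :]) ** 2).sum(-1)
    return sensor_vals[d2.argmin(axis=1)].reshape(ny, nx)

rng = np.random.default_rng(1)
xy = rng.random((20, 2))                 # 20 scattered sensor locations
vals = np.sin(2 * np.pi * xy[:, 0])      # field sampled at the sensors
field = voronoi_field(xy, vals)          # dense, structured network input
```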
arXiv Detail & Related papers (2024-08-31T13:43:52Z)
- SFANet: Spatial-Frequency Attention Network for Weather Forecasting
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z)
- Generative Modeling with Phase Stochastic Bridges
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z)
- Machine learning in and out of equilibrium
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
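For reference, vanilla SGLD (without the paper's without-replacement minibatching variant) is only a few lines: the injected Gaussian noise, scaled to match the step size, turns a gradient optimizer into an approximate posterior sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgld(grad_log_post, theta0, eta=1e-2, n_steps=20000):
    """Stochastic gradient Langevin dynamics for a scalar parameter:
    each step is a gradient ascent step on the log-posterior plus
    Gaussian noise of scale sqrt(eta)."""
    theta, samples = theta0, []
    for _ in range(n_steps):
        theta = theta + 0.5 * eta * grad_log_post(theta) \
                + np.sqrt(eta) * rng.standard_normal()
        samples.append(theta)
    return np.array(samples)

# Toy posterior: standard normal, so grad log p(theta) = -theta.
# After burn-in the chain's samples have mean ~0 and variance ~1.
draws = sgld(lambda th: -th, theta0=0.0)
```

In practice `grad_log_post` would be estimated on minibatches; the paper's contribution concerns how those minibatches are drawn, which this toy omits.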
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Reconstruction, forecasting, and stability of chaotic dynamics from partial data
We propose data-driven methods to infer the dynamics of hidden chaotic variables from partial observations.
We show that the proposed networks can forecast the hidden variables, both time-accurately and statistically.
This work opens new opportunities for reconstructing the full state, inferring hidden variables, and computing the stability of chaotic systems from partial data.
arXiv Detail & Related papers (2023-05-24T13:01:51Z)
- Spectral learning of Bernoulli linear dynamical systems models
We develop a learning method for fast, efficient fitting of latent linear dynamical system models.
Our approach extends traditional subspace identification methods to the Bernoulli setting.
We show that the estimator performs well in real-world settings by analyzing data from mice performing a sensory decision-making task.
arXiv Detail & Related papers (2023-03-03T16:29:12Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
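The idea of replacing many small input-noise realizations with a deterministic equivalent can be illustrated on a linear readout, where averaging noisy least-squares fits is well approximated by a single ridge regression. This is a toy analogue of the principle, not the LMNT algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear readout standing in for a trained forecasting model
X = rng.standard_normal((200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true

sigma = 0.1  # scale of the small input noise added during training

# Stochastic route: average the least-squares fit over many independent
# noise realizations added to the training inputs
ws = [np.linalg.lstsq(X + sigma * rng.standard_normal(X.shape), y,
                      rcond=None)[0] for _ in range(500)]
w_noise = np.mean(ws, axis=0)

# Deterministic route: minimizing the *expected* noisy loss gives ridge
# regression with lambda = n * sigma^2 -- no noise realizations needed
lam = X.shape[0] * sigma ** 2
w_det = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
```

The two solutions agree closely for small noise, which is why a linearized, deterministic approximation can replace expensive multi-realization noise training.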
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Structured Optimal Variational Inference for Dynamic Latent Space Models
We consider a latent space model for dynamic networks, where our objective is to estimate the pairwise inner products plus the intercept of the latent positions.
To balance posterior inference and computational scalability, we consider a structured mean-field variational inference framework.
arXiv Detail & Related papers (2022-09-29T22:10:42Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
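As background, plain (unforced) dynamic mode decomposition fits a best-fit linear map between successive snapshots; its eigenvalues encode the intrinsic frequencies and growth rates. This sketch uses a synthetic two-frequency signal rather than grid-load data, and omits the paper's stochastic forcing and ensembling.

```python
import numpy as np

def dmd(X, r):
    """Exact DMD: fit X2 ~= A @ X1 between successive snapshot columns
    via a rank-r truncated SVD, returning eigenvalues and modes of A."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    Atil = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(Atil)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

# Near-periodic toy data: two undamped frequencies seen through 10 channels
t = np.linspace(0, 8 * np.pi, 400)
latent = np.vstack([np.sin(t), np.cos(t), np.sin(2 * t), np.cos(2 * t)])
C = np.random.default_rng(2).standard_normal((10, 4))
X = C @ latent
eigvals, _ = dmd(X, r=4)
# For purely oscillatory dynamics the DMD eigenvalues lie on the unit circle,
# which is what makes the linear model interpretable for near-periodic loads.
```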
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling
We propose an artificial neural network with a mechanism to implicitly learn the phase-space properties.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
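The phase-space structure such a network must learn is classically exposed by Takens-style delay embedding: a scalar time series is lifted to vectors of lagged copies of itself. A minimal sketch, with an illustrative quarter-period lag:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style delay embedding: turn a scalar series x(t) into
    vectors (x(t), x(t+tau), ..., x(t+(dim-1)*tau)) whose trajectory
    reconstructs the underlying phase space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

t = np.linspace(0.0, 20 * np.pi, 2001)
x = np.sin(t)                           # scalar observable of a 2-D oscillator
emb = delay_embed(x, dim=2, tau=50)     # tau = quarter period of the signal
radius = np.hypot(emb[:, 0], emb[:, 1])
# With a quarter-period lag the embedded trajectory traces the oscillator's
# circular orbit, recovering the 2-D phase portrait from 1-D observations.
```

Choosing `dim` and `tau` well is exactly the problem the paper's network sidesteps by learning the embedding implicitly.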
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.