Forecasting Fold Bifurcations through Physics-Informed Convolutional
Neural Networks
- URL: http://arxiv.org/abs/2312.14210v1
- Date: Thu, 21 Dec 2023 10:07:52 GMT
- Title: Forecasting Fold Bifurcations through Physics-Informed Convolutional
Neural Networks
- Authors: Giuseppe Habib and Ádám Horváth
- Abstract summary: This study proposes a physics-informed convolutional neural network (CNN) for identifying dynamical systems' time series near a fold bifurcation.
The CNN is trained with a relatively small amount of data and on a single, very simple system.
Such a task requires significant extrapolation capabilities, which are obtained by exploiting physics-based information.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This study proposes a physics-informed convolutional neural network (CNN) for
identifying dynamical systems' time series near a fold bifurcation. The
peculiarity of this work is that the CNN is trained with a relatively small
amount of data and on a single, very simple system. In contrast, the CNN is
validated on much more complicated systems. Such a task requires significant
extrapolation capabilities, which are obtained by exploiting physics-based
information. This physics-based information is provided through a specific
pre-processing of the input data, consisting mainly of a transformation into
polar coordinates, normalization, a transformation to a logarithmic scale,
and filtering through a moving mean. The results illustrate that such data
pre-processing enables the CNN to grasp the important features related to
approaching a fold bifurcation, namely, the trend of the oscillation amplitude,
and neglect other characteristics that are not particularly relevant, such as
the vibration frequency. The developed CNN was able to correctly classify
trajectories near a fold for a mass-on-moving-belt system, a van der
Pol-Duffing oscillator with an attached tuned mass damper, and a
pitch-and-plunge wing profile. The results obtained pave the way for the
development of similar CNNs effective in real-life applications.
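As an illustration of the pre-processing pipeline described in the abstract (polar coordinates, normalization, logarithmic scale, moving mean), the following is a minimal Python/NumPy sketch applied to a single trajectory. The exact formulas, the moving-mean window length, and the synthetic test signal are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def preprocess_trajectory(x, v, window=50):
    """Physics-based pre-processing of one trajectory (illustrative sketch).

    Pipeline per the abstract: polar coordinates -> normalization ->
    logarithmic scale -> moving mean. The formulas and the window length
    are assumptions, not the authors' exact implementation.
    """
    # 1. Polar coordinates: instantaneous oscillation amplitude.
    r = np.sqrt(x**2 + v**2)

    # 2. Normalization: remove dependence on the absolute scale.
    r = r / np.max(np.abs(r))

    # 3. Logarithmic scale: exponential amplitude trends become linear.
    log_r = np.log(r + 1e-12)

    # 4. Moving mean: average out the fast oscillation and keep the slow
    #    amplitude trend that signals an approaching fold bifurcation.
    kernel = np.ones(window) / window
    return np.convolve(log_r, kernel, mode="valid")


# Usage on a synthetic oscillation whose amplitude grows slowly in time,
# mimicking a trajectory drifting towards a fold.
t = np.linspace(0.0, 100.0, 5000)
x = (0.1 + 0.002 * t) * np.cos(2.0 * np.pi * t)
v = np.gradient(x, t)
features = preprocess_trajectory(x, v)   # 1-D feature series fed to the CNN
```

Because the pre-processing reduces every trajectory to a normalized, smoothed amplitude trend, the classifier is largely insensitive to the vibration frequency, which is what allows a network trained on a single simple system to generalize to the more complicated validation systems.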
Related papers
- Heterogeneous quantization regularizes spiking neural network activity
We present a data-blind neuromorphic signal conditioning strategy whereby analog data are normalized and quantized into spike phase representations.
We extend this mechanism by adding a data-aware calibration step whereby the range and density of the quantization weights adapt to accumulated input statistics.
arXiv Detail & Related papers (2024-09-27T02:25:44Z)
- Learning noise-induced transitions by multi-scaling reservoir computing
We develop a machine learning model, reservoir computing (a type of recurrent neural network), to learn noise-induced transitions.
The trained model generates accurate statistics of transition time and the number of transitions.
It is also aware of the asymmetry of the double-well potential, the rotational dynamics caused by non-detailed balance, and transitions in multi-stable systems.
arXiv Detail & Related papers (2023-09-11T12:26:36Z)
- Machine learning in and out of equilibrium
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Learning Flow Functions from Data with Applications to Nonlinear Oscillators
We show that learning the flow function is equivalent to learning the input-to-state map of a discrete-time dynamical system.
This motivates the use of an RNN together with encoder and decoder networks which map the state of the system to the hidden state of the RNN and back.
arXiv Detail & Related papers (2023-03-29T13:04:04Z)
- Neuronal architecture extracts statistical temporal patterns
We show how higher-order temporal (co-)fluctuations can be employed to represent and process information.
A simple biologically inspired feedforward neuronal model is able to extract information from up to the third-order cumulant to perform time series classification.
arXiv Detail & Related papers (2023-01-24T18:21:33Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Data-driven emergence of convolutional structure in neural networks
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Supervised DKRC with Images for Offline System Identification
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns these basis functions using a supervised learning approach.
arXiv Detail & Related papers (2021-09-06T04:39:06Z)
- Adaptive Machine Learning for Time-Varying Systems: Low Dimensional Latent Space Tuning
We present a recently developed method of adaptive machine learning for time-varying systems.
Our approach is to map very high (N > 100k) dimensional inputs into a low-dimensional (N ~ 2) latent space at the output of the encoder section of an encoder-decoder CNN.
This method allows us to learn correlations within the data and to track their evolution in real time based on feedback, without interruptions.
arXiv Detail & Related papers (2021-07-13T16:05:28Z)
- Adaptive Latent Space Tuning for Non-Stationary Distributions
We present a method for adaptive tuning of the low-dimensional latent space of deep encoder-decoder style CNNs.
We demonstrate our approach for predicting the properties of a time-varying charged particle beam in a particle accelerator.
arXiv Detail & Related papers (2021-05-08T03:50:45Z)