Forecasting Fold Bifurcations through Physics-Informed Convolutional
Neural Networks
- URL: http://arxiv.org/abs/2312.14210v1
- Date: Thu, 21 Dec 2023 10:07:52 GMT
- Title: Forecasting Fold Bifurcations through Physics-Informed Convolutional
Neural Networks
- Authors: Giuseppe Habib and Ádám Horváth
- Abstract summary: This study proposes a physics-informed convolutional neural network (CNN) for identifying dynamical systems' time series near a fold bifurcation.
The CNN is trained with a relatively small amount of data and on a single, very simple system.
Such a task requires significant extrapolation capabilities, which are obtained by exploiting physics-based information.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This study proposes a physics-informed convolutional neural network (CNN) for
identifying dynamical systems' time series near a fold bifurcation. A
distinctive aspect of this work is that the CNN is trained on a relatively
small amount of data generated from a single, very simple system, while it is
validated on much more complicated systems. Such a task requires significant
extrapolation capabilities, which are obtained by exploiting physics-based
information. The physics-based information is provided through a specific
pre-processing of the input data, consisting mainly of a transformation into
polar coordinates, normalization, conversion to a logarithmic scale, and
smoothing with a moving mean. The results illustrate that such data
pre-processing enables the CNN to grasp the important features related to
approaching a fold bifurcation, namely, the trend of the oscillation amplitude,
and neglect other characteristics that are not particularly relevant, such as
the vibration frequency. The developed CNN was able to correctly classify
trajectories near a fold for a mass-on-moving-belt system, a van der
Pol-Duffing oscillator with an attached tuned mass damper, and a
pitch-and-plunge wing profile. The results obtained pave the way for the
development of similar CNNs effective in real-life applications.
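
The abstract describes the physics-based pre-processing only at a high level (polar coordinates, normalization, logarithmic scaling, moving-mean filtering). The sketch below is a minimal, hypothetical illustration of such a pipeline for one sampled displacement/velocity trajectory; the function name, the moving-mean window length, the normalization choice, and the synthetic example signal are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def preprocess_trajectory(x, v, window=50):
    """Hypothetical sketch of the described pre-processing for one trajectory.

    x, v   : equal-length arrays of displacement and velocity samples.
    window : moving-mean length (an assumed hyperparameter).
    Returns a smoothed log-amplitude sequence that retains the amplitude
    trend and largely discards the oscillation frequency.
    """
    # 1. Polar coordinates in the (x, v) phase plane: the radius is a proxy
    #    for the instantaneous oscillation amplitude.
    r = np.sqrt(x**2 + v**2)

    # 2. Normalization, so trajectories from different systems and scales
    #    become comparable.
    r = r / np.max(r)

    # 3. Logarithmic scale, emphasizing relative changes of the amplitude.
    log_r = np.log(r + 1e-12)  # small offset avoids log(0)

    # 4. Moving mean, filtering out the fast oscillation and leaving the
    #    slow amplitude trend that signals the approach to a fold.
    kernel = np.ones(window) / window
    return np.convolve(log_r, kernel, mode="valid").astype(np.float32)

# Example: a slowly decaying oscillation, the kind of transient such a
# classifier would receive after pre-processing (synthetic, for illustration).
t = np.linspace(0.0, 100.0, 5000)
x = np.exp(-0.05 * t) * np.cos(2.0 * np.pi * t)
v = np.gradient(x, t)
features = preprocess_trajectory(x, v)  # 1-D input for a CNN classifier
```

In this sketch, the resulting one-dimensional sequence is what a CNN classifier would consume; because the oscillation frequency is averaged out, the network is pushed to rely on the amplitude trend, which is the feature relevant to an approaching fold.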
Related papers
- Localized Gaussians as Self-Attention Weights for Point Clouds Correspondence [92.07601770031236] (arXiv, 2024-09-20T07:41:47Z)
  We investigate semantically meaningful patterns in the attention heads of an encoder-only Transformer architecture.
  We find that fixing the attention weights not only accelerates the training process but also enhances the stability of the optimization.
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195] (arXiv, 2023-12-04T01:32:42Z)
  Entropy and mutual information in neural networks provide rich information on the learning process.
  We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
  We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
- How neural networks learn to classify chaotic time series [77.34726150561087] (arXiv, 2023-06-04T08:53:27Z)
  We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
  We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
- Interpreting convolutional neural networks' low dimensional approximation to quantum spin systems [1.631115063641726] (arXiv, 2022-10-03T02:49:16Z)
  Convolutional neural networks (CNNs) have been employed along with Variational Monte Carlo methods for finding the ground state of quantum many-body spin systems.
  We provide a theoretical and experimental analysis of how the CNN optimizes learning for spin systems, and investigate the CNN's low dimensional approximation.
  Our results allow us to gain a comprehensive, improved understanding of how CNNs successfully approximate quantum spin Hamiltonians.
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233] (arXiv, 2022-02-01T17:11:13Z)
  We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
  By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
- Estimating permeability of 3D micro-CT images by physics-informed CNNs based on DNS [1.6274397329511197] (arXiv, 2021-09-04T08:43:19Z)
  This paper presents a novel methodology for permeability prediction from micro-CT scans of geological rock samples.
  The training data set for CNNs dedicated to permeability prediction consists of permeability labels that are typically generated by classical lattice Boltzmann methods (LBM).
  We instead perform direct numerical simulation (DNS) by solving the stationary Stokes equation in an efficient and distributed-parallel manner.
- Adaptive Latent Space Tuning for Non-Stationary Distributions [62.997667081978825] (arXiv, 2021-05-08T03:50:45Z)
  We present a method for adaptive tuning of the low-dimensional latent space of deep encoder-decoder style CNNs.
  We demonstrate our approach for predicting the properties of a time-varying charged particle beam in a particle accelerator.
- Transfer Learning with Convolutional Networks for Atmospheric Parameter Retrieval [14.131127382785973] (arXiv, 2020-12-09T09:28:42Z)
  The Infrared Atmospheric Sounding Interferometer (IASI) on board the MetOp satellite series provides important measurements for Numerical Weather Prediction (NWP).
  Retrieving accurate atmospheric parameters from the raw data provided by IASI is a large challenge, but necessary in order to use the data in NWP models.
  We show how features extracted from the IASI data by a CNN trained to predict a physical variable can be used as inputs to another statistical method designed to predict a different physical variable at low altitude.
- Curriculum By Smoothing [52.08553521577014] (arXiv, 2020-03-03T07:27:44Z)
  Convolutional Neural Networks (CNNs) have shown impressive performance in computer vision tasks such as image classification, detection, and segmentation.
  We propose an elegant curriculum-based scheme that smooths the feature embedding of a CNN using anti-aliasing or low-pass filters.
  As the amount of information in the feature maps increases during training, the network is able to progressively learn better representations of the data.
- The use of Convolutional Neural Networks for signal-background classification in Particle Physics experiments [0.4301924025274017] (arXiv, 2020-02-13T19:54:46Z)
  We present an extensive convolutional neural architecture search, achieving high accuracy for signal/background discrimination for a HEP classification use-case.
  We demonstrate among other things that we can achieve the same accuracy as complex ResNet architectures with CNNs that have fewer parameters.