Neural oscillators for magnetic hysteresis modeling
- URL: http://arxiv.org/abs/2308.12002v1
- Date: Wed, 23 Aug 2023 08:41:24 GMT
- Title: Neural oscillators for magnetic hysteresis modeling
- Authors: Abhishek Chandra, Taniya Kapoor, Bram Daniels, Mitrofan Curti, Koen Tiels, Daniel M. Tartakovsky, Elena A. Lomonova
- Abstract summary: Hysteresis is a ubiquitous phenomenon in science and engineering.
We develop an ordinary differential equation-based recurrent neural network (RNN) approach to model and quantify the phenomenon.
- Score: 0.7444373636055321
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hysteresis is a ubiquitous phenomenon in science and engineering; its
modeling and identification are crucial for understanding and optimizing the
behavior of various systems. We develop an ordinary differential equation-based
recurrent neural network (RNN) approach to model and quantify hysteresis,
which manifests itself in sequentiality and history-dependence. Our neural
oscillator, HystRNN, draws inspiration from coupled-oscillatory RNN and
phenomenological hysteresis models to update the hidden states. The performance
of HystRNN is evaluated on generalized prediction scenarios involving
first-order reversal curves and minor loops. The findings show the ability of HystRNN to
generalize its behavior to previously untrained regions, an essential feature
that hysteresis models must have. This research highlights the advantage of
neural oscillators over traditional RNN-based methods in capturing complex
hysteresis patterns in magnetic materials, where conventional rate-dependent
methods are inadequate to capture the intrinsic nonlinearity.
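As a concrete illustration of the oscillator update, the sketch below implements a coRNN-style second-order hidden state (position y, velocity z) driven by a cyclic excitation standing in for the applied field H. All weights, dimensions, and damping constants are assumptions for illustration; HystRNN's actual modified update rule is given in the paper, not here.

```python
# Minimal sketch of a coupled-oscillator RNN (coRNN) state update, the kind of
# neural oscillator HystRNN builds on. Every weight and constant below is an
# illustrative assumption, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, dt, gamma, eps = 1, 16, 0.01, 1.0, 0.1

W_y = rng.normal(scale=0.1, size=(d_h, d_h))   # couples positions y
W_z = rng.normal(scale=0.1, size=(d_h, d_h))   # couples velocities z
V = rng.normal(scale=0.1, size=(d_h, d_in))    # input weights
b = np.zeros(d_h)

def step(y, z, u):
    """Second-order (oscillatory) hidden-state update, explicit Euler."""
    z_new = z + dt * (np.tanh(W_y @ y + W_z @ z + V @ u + b)
                      - gamma * y - eps * z)
    y_new = y + dt * z_new
    return y_new, z_new

# Drive the oscillator with a cyclic excitation, as in hysteresis experiments.
y, z = np.zeros(d_h), np.zeros(d_h)
for t in np.linspace(0, 2 * np.pi, 200):
    u = np.array([np.sin(t)])  # stand-in for the applied magnetic field H
    y, z = step(y, z, u)
```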
Related papers
- Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS)
Our method builds on previous work modeling the latent state evolution via a differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs)
Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
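A minimal sketch of the switching-linear-dynamics idea follows, assuming a smooth sigmoid blend between two made-up linear regimes; the gpSLDS itself places a Gaussian-process prior on these dynamics, which is omitted here.

```python
# Illustrative smoothly switching linear dynamics in the spirit of (gp)SLDS
# models; all matrices and the regime boundary are made-up examples.
import numpy as np

A1 = np.array([[0.0, -1.0], [1.0, 0.0]]) * 0.9   # regime 1: rotation
A2 = np.array([[-0.5, 0.0], [0.0, -0.5]])        # regime 2: decay
w, b = np.array([1.0, 0.0]), 0.0                 # linear boundary between regimes

def drift(x):
    # Soft mixture of linear regimes; the gpSLDS places a GP prior over this
    # drift function with a kernel favoring locally linear behavior.
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))       # smooth regime weight
    return (p * A1 + (1 - p) * A2) @ x

rng = np.random.default_rng(1)
x, dt, traj = np.array([1.0, 0.0]), 0.01, []
for _ in range(1000):
    x = x + dt * drift(x) + np.sqrt(dt) * 0.05 * rng.normal(size=2)
    traj.append(x.copy())
```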
arXiv Detail & Related papers (2024-07-19T15:32:15Z)
- Inferring stochastic low-rank recurrent neural networks from neural data [5.179844449042386]
A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system.
Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics.
Here, we propose to fit low-rank RNNs with variational sequential Monte Carlo methods.
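A minimal sketch of the model class follows, assuming a rank-1 connectivity matrix; the variational sequential Monte Carlo inference is omitted.

```python
# Minimal rank-1 recurrent network, the model class the paper fits with
# variational sequential Monte Carlo (the inference step is not shown).
import numpy as np

rng = np.random.default_rng(2)
N = 200
m = rng.normal(size=N)          # left connectivity vector
n = rng.normal(size=N)          # right connectivity vector
J = np.outer(m, n) / N          # rank-1 connectivity: tractable latent dynamics

x, dt, tau = rng.normal(size=N), 0.01, 1.0   # state, step size, time constant
kappa = []
for _ in range(2000):
    noise = 0.1 * np.sqrt(dt) * rng.normal(size=N)
    x = x + (dt / tau) * (-x + J @ np.tanh(x)) + noise
    kappa.append(n @ np.tanh(x) / N)   # 1-D latent coordinate kappa(t)
```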
arXiv Detail & Related papers (2024-06-24T15:57:49Z)
- Novel Kernel Models and Exact Representor Theory for Neural Networks Beyond the Over-Parameterized Regime [52.00917519626559]
This paper presents two models of neural networks and their training, applicable to networks of arbitrary width, depth, and topology.
We also present a novel exact representor theory for layer-wise neural network training with unregularized gradient descent in terms of a local-extrinsic neural kernel (LeNK).
This representor theory gives insight into the role of higher-order statistics in neural network training and the effect of kernel evolution in neural-network kernel models.
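For orientation, the sketch below computes a standard empirical tangent kernel from per-example parameter gradients; the paper's LeNK construction differs and is not reproduced here, so this is only a familiar reference point for the idea of a neural-network kernel.

```python
# Empirical tangent kernel K(x, x') = <grad_theta f(x), grad_theta f(x')>,
# a standard construction; the paper's LeNK is a different kernel.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(3, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def grad_vector(x):
    """Gradient of the scalar output w.r.t. all parameters, flattened."""
    net.zero_grad()
    net(x).squeeze().backward()
    return torch.cat([p.grad.flatten() for p in net.parameters()])

x1, x2 = torch.randn(1, 3), torch.randn(1, 3)
g1, g2 = grad_vector(x1), grad_vector(x2)
print("empirical kernel value:", torch.dot(g1, g2).item())
```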
arXiv Detail & Related papers (2024-05-24T06:30:36Z)
- Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices at very high speed.
We propose a fully spiking denoising diffusion implicit model (FSDDIM) to construct a diffusion model entirely within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
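A minimal sketch of one deterministic DDIM sampling step follows, assuming a toy noise schedule and a placeholder noise predictor; FSDDIM realizes such updates entirely with spiking networks, which is not shown here.

```python
# One deterministic (eta = 0) DDIM update; the noise predictor below is a
# placeholder stand-in, not the paper's spiking network.
import numpy as np

def ddim_step(x_t, eps_pred, abar_t, abar_prev):
    """Map x_t to the previous diffusion step deterministically."""
    x0_pred = (x_t - np.sqrt(1.0 - abar_t) * eps_pred) / np.sqrt(abar_t)
    return np.sqrt(abar_prev) * x0_pred + np.sqrt(1.0 - abar_prev) * eps_pred

rng = np.random.default_rng(3)
x = rng.normal(size=16)                 # start from pure noise
abars = np.linspace(0.999, 0.01, 50)    # toy cumulative alpha schedule
for t in range(len(abars) - 1, 0, -1):
    eps_pred = 0.1 * x                  # placeholder for the trained predictor
    x = ddim_step(x, eps_pred, abars[t], abars[t - 1])
```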
arXiv Detail & Related papers (2023-12-04T09:07:09Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
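A minimal sketch of the task setup follows, assuming a toy logistic-map dataset and a generic 1-D CNN rather than the paper's LKCNN architecture.

```python
# Toy regular-versus-chaotic classification: periodic (r = 3.5) vs chaotic
# (r = 3.9) logistic-map series, separated by a small 1-D CNN.
import torch
import torch.nn as nn

def logistic_series(r, n=128, x0=0.5, burn=200):
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return torch.tensor(out).unsqueeze(0)  # shape (1, n)

xs, ys = [], []
for i in range(64):                        # vary initial conditions
    x0 = 0.1 + 0.8 * i / 64
    xs += [logistic_series(3.5, x0=x0), logistic_series(3.9, x0=x0)]
    ys += [0, 1]
X, y = torch.stack(xs), torch.tensor(ys)

model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=5), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
    nn.Flatten(), nn.Linear(8, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(X), y)
    loss.backward()
    opt.step()
```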
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
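A minimal sketch of fitting parameters by differentiating through a surrogate follows, with an analytic stand-in for the trained network and a made-up coupling parameter J; the paper's Hamiltonian emulator and scattering data are not reproduced.

```python
# Recover an unknown parameter from noisy "experimental" data by gradient
# descent through a differentiable surrogate model.
import torch

def surrogate(q, J):                     # stand-in for a trained emulator
    return 1.0 / (1.0 + (q - J) ** 2)    # Lorentzian peak centered at q = J

q = torch.linspace(0.1, 5.0, 200)
data = surrogate(q, torch.tensor(1.7)) + 0.01 * torch.randn(200)  # synthetic

J = torch.tensor(0.5, requires_grad=True)   # unknown parameter to recover
opt = torch.optim.Adam([J], lr=0.05)
for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((surrogate(q, J) - data) ** 2)
    loss.backward()
    opt.step()
print("recovered J:", round(J.item(), 3))   # approaches the true value 1.7
```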
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Expressive architectures enhance interpretability of dynamics-based neural population models [2.294014185517203]
We evaluate the performance of sequential autoencoders (SAEs) in recovering latent chaotic attractors from simulated neural datasets.
We found that SAEs with widely-used recurrent neural network (RNN)-based dynamics were unable to infer accurate firing rates at the true latent state dimensionality.
arXiv Detail & Related papers (2022-12-07T16:44:26Z)
- Stabilized Neural Ordinary Differential Equations for Long-Time Forecasting of Dynamical Systems [1.001737665513683]
We present a data-driven modeling method that accurately captures shocks and chaotic dynamics.
We learn the right-hand side (RHS) of an ODE by adding together the outputs of two neural networks, where one learns a linear term and the other a nonlinear term.
Specifically, we implement this by training a sparse linear convolutional NN to learn the linear term and a dense fully-connected nonlinear NN to learn the nonlinear term.
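A minimal sketch of this linear-plus-nonlinear RHS split follows, assuming toy sizes and an explicit-Euler rollout in place of a proper ODE solver; the training procedure is omitted.

```python
# Learned ODE right-hand side as the sum of a sparse linear convolutional
# term and a dense fully-connected nonlinear term; sizes are illustrative.
import torch
import torch.nn as nn

n = 64  # spatial grid points of the state u(x, t)

linear_term = nn.Conv1d(1, 1, kernel_size=3, padding=1, bias=False)  # no activation
nonlinear_term = nn.Sequential(
    nn.Linear(n, 128), nn.Tanh(), nn.Linear(128, n)
)

def rhs(u):                                        # u: (batch, n)
    lin = linear_term(u.unsqueeze(1)).squeeze(1)   # local, purely linear stencil
    return lin + nonlinear_term(u)                 # plus the nonlinear correction

# Explicit-Euler rollout of the learned dynamics (training loop omitted).
u, dt = torch.randn(1, n), 1e-3
for _ in range(100):
    u = u + dt * rhs(u)
```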
arXiv Detail & Related papers (2022-03-29T16:10:34Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks (EINNs) crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
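A minimal sketch of the physics-informed ingredient follows, assuming an SIR compartmental model with toy rate constants; the full EINN couples a residual like this with data-fit and other terms.

```python
# Penalize a network t -> (S, I, R) for violating SIR dynamics; beta, gamma,
# and the collocation setup are toy assumptions, not the paper's.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 3))
beta, gamma = 0.3, 0.1

def ode_residual(t):
    t = t.requires_grad_(True)
    S, I, R = net(t).unbind(dim=1)
    dS = torch.autograd.grad(S.sum(), t, create_graph=True)[0].squeeze(1)
    dI = torch.autograd.grad(I.sum(), t, create_graph=True)[0].squeeze(1)
    dR = torch.autograd.grad(R.sum(), t, create_graph=True)[0].squeeze(1)
    rS = dS + beta * S * I                 # dS/dt = -beta S I
    rI = dI - beta * S * I + gamma * I     # dI/dt =  beta S I - gamma I
    rR = dR - gamma * I                    # dR/dt =  gamma I
    return (rS ** 2 + rI ** 2 + rR ** 2).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
t_col = torch.rand(256, 1) * 50.0          # collocation times
for _ in range(200):
    opt.zero_grad()
    loss = ode_residual(t_col)             # plus a data-fit term in practice
    loss.backward()
    opt.step()
```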
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Path classification by stochastic linear recurrent neural networks [2.5499055723658097]
We show that RNNs retain a partial signature of the paths they are fed as the unique information exploited for training and classification tasks.
We argue that these RNNs are easy to train and robust, and we back these observations with numerical experiments on both synthetic and real data.
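A minimal sketch of the signature connection follows, assuming identity recurrence and input maps so that the linear RNN state exactly accumulates the level-1 signature (the path's total increment); general matrices would mix in higher iterated integrals.

```python
# A linear RNN driven by a path's increments stores signature terms of that
# path; the matrices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
T, d = 100, 2
path = np.cumsum(rng.normal(size=(T, d)), axis=0)   # a 2-D random walk
dx = np.diff(path, axis=0)

A = np.eye(d)   # recurrence; with A = I the state accumulates level-1 terms
B = np.eye(d)   # input map
h = np.zeros(d)
for inc in dx:
    h = A @ h + B @ inc

level1 = path[-1] - path[0]      # level-1 signature = total increment
print(np.allclose(h, level1))    # True: the RNN state holds signature terms
```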
arXiv Detail & Related papers (2021-08-06T12:59:12Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
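A minimal sketch of this training strategy follows, assuming synthetic teacher signals: the loss combines an output term with a term matching the activities of a small recorded subset of hidden units.

```python
# Fit an RNN to reproduce target outputs and, simultaneously, the activity of
# a small "recorded" subset of its units; all signals here are synthetic.
import torch
import torch.nn as nn

N, n_rec, T = 64, 8, 50                      # hidden units, recorded units, steps
rnn = nn.RNN(input_size=1, hidden_size=N, batch_first=True)
readout = nn.Linear(N, 1)

u = torch.randn(1, T, 1)                     # input drive
y_target = torch.sin(torch.linspace(0, 6.28, T)).view(1, T, 1)
h_target = torch.randn(1, T, n_rec)          # stand-in recorded activities
recorded = torch.arange(n_rec)               # indices of observed neurons

opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    h, _ = rnn(u)                            # hidden trajectory, (1, T, N)
    loss_out = nn.functional.mse_loss(readout(h), y_target)
    loss_dyn = nn.functional.mse_loss(h[..., recorded], h_target)
    (loss_out + loss_dyn).backward()         # output + internal-dynamics terms
    opt.step()
```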
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and accepts no responsibility for any consequences arising from its use.