Magnetic Hysteresis Modeling with Neural Operators
- URL: http://arxiv.org/abs/2407.03261v2
- Date: Sun, 10 Nov 2024 14:36:15 GMT
- Title: Magnetic Hysteresis Modeling with Neural Operators
- Authors: Abhishek Chandra, Bram Daniels, Mitrofan Curti, Koen Tiels, Elena A. Lomonova,
- Abstract summary: This paper proposes neural operators for modeling constitutive laws that exhibit magnetic hysteresis by learning a mapping between magnetic fields.
Three neural operators (deep operator network, Fourier neural operator, and wavelet neural operator) are employed to predict novel first-order reversal curves and minor loops.
A rate-independent Fourier neural operator is proposed to predict material responses at sampling rates different from those used during training, incorporating the rate-independent characteristics of magnetic hysteresis.
- Abstract: Hysteresis modeling is crucial to comprehending the behavior of magnetic devices, facilitating optimal designs. Hitherto, deep learning-based methods employed to model hysteresis have faced challenges in generalizing to novel input magnetic fields. This paper addresses the generalization challenge by proposing neural operators for modeling constitutive laws that exhibit magnetic hysteresis by learning a mapping between magnetic fields. In particular, three neural operators (deep operator network, Fourier neural operator, and wavelet neural operator) are employed to predict novel first-order reversal curves and minor loops, where novel means they are not used to train the model. In addition, a rate-independent Fourier neural operator is proposed to predict material responses at sampling rates different from those used during training, incorporating the rate-independent characteristics of magnetic hysteresis. The presented numerical experiments demonstrate that neural operators efficiently model magnetic hysteresis, outperforming traditional recurrent neural methods on various metrics and generalizing to novel magnetic fields. The findings emphasize the advantages of using neural operators for modeling hysteresis under varying magnetic conditions, underscoring their importance in characterizing magnetic-material-based devices. The codes related to this paper are at github.com/chandratue/magnetic_hysteresis_neural_operator.
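The abstract's two key ideas, a learned map between fields and rate independence across sampling rates, both rest on the Fourier neural operator's spectral-convolution layer. The sketch below is a minimal, illustrative single layer with random stand-in weights (not the authors' implementation): the input field is transformed to frequency space, a few low Fourier modes are multiplied by learned complex coefficients, and the result is transformed back. Because the parameters act on modes rather than grid points, the same operator can be queried at different sampling rates.

```python
import numpy as np

def spectral_conv_1d(h, weights):
    """Core FNO operation: FFT, multiply retained low modes, inverse FFT."""
    n = len(h)
    H = np.fft.rfft(h)               # frequency representation of the input field
    k = len(weights)                 # number of retained Fourier modes
    out = np.zeros_like(H)
    out[:k] = H[:k] * weights        # learned complex multipliers on low modes
    return np.fft.irfft(out, n=n)

rng = np.random.default_rng(0)
# Random stand-ins for trained parameters.
weights = rng.standard_normal(8) + 1j * rng.standard_normal(8)

# The same operator applied to the same input field H(t) at two sampling rates:
t_coarse = np.linspace(0, 1, 128, endpoint=False)
t_fine = np.linspace(0, 1, 512, endpoint=False)
b_coarse = spectral_conv_1d(np.sin(2 * np.pi * t_coarse), weights)
b_fine = spectral_conv_1d(np.sin(2 * np.pi * t_fine), weights)
```

At the shared grid points the coarse and fine outputs agree, which is the discretization-invariance property the rate-independent variant builds on.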
Related papers
- Neural Operators Learn the Local Physics of Magnetohydrodynamics [6.618373975988337]
Magnetohydrodynamics (MHD) plays a pivotal role in describing the dynamics of plasma and conductive fluids.
Recent advances introduce neural operators like the Fourier Neural Operator (FNO) as surrogate models for traditional numerical analyses.
This study explores a modified Flux Fourier neural operator model to approximate the numerical flux of ideal MHD.
arXiv Detail & Related papers (2024-04-24T17:48:38Z)
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO), which directly models dynamics as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z)
- Machine-learned models for magnetic materials [0.0]
Magnetic materials represented by multidimensional characteristics (that mimic measurements) are used to train the neural autoencoder model.
The neural model is trained to capture a synthetically generated set of characteristics that can cover a broad range of material behaviors.
We prove its usefulness in the complex problem of modeling magnetic materials in the frequency and current (out-of-linear range) domains simultaneously.
arXiv Detail & Related papers (2023-12-29T20:47:04Z)
- Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
An AI framework, known as Neural Operators, presents a principled framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z)
- Neural oscillators for magnetic hysteresis modeling [0.7444373636055321]
Hysteresis is a ubiquitous phenomenon in science and engineering.
We develop an ordinary differential equation-based recurrent neural network (RNN) approach to model and quantify the phenomenon.
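An ODE-based recurrent update of the kind this paper describes can be sketched generically: a hidden state evolves under a learned vector field, integrated with explicit Euler steps, one per input sample. The dynamics and weights below are illustrative stand-ins, not the paper's trained oscillator model.

```python
import numpy as np

def ode_rnn(h_inputs, w_in, w_rec, dt=0.01):
    """Hidden state follows z' = tanh(W_rec z + w_in * h(t)), stepped with explicit Euler."""
    z = np.zeros(len(w_rec))
    states = []
    for h in h_inputs:
        z = z + dt * np.tanh(w_rec @ z + w_in * h)  # one Euler step per input sample
        states.append(z.copy())
    return np.array(states)

rng = np.random.default_rng(1)
d = 4  # hidden-state dimension (illustrative)
h_field = np.sin(np.linspace(0, 2 * np.pi, 100))   # toy applied-field waveform
traj = ode_rnn(h_field, rng.standard_normal(d), rng.standard_normal((d, d)))
```

A readout layer mapping `z` to the material response would sit on top; here only the recurrent ODE integration itself is shown.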
arXiv Detail & Related papers (2023-08-23T08:41:24Z)
- A neural operator-based surrogate solver for free-form electromagnetic inverse design [0.0]
We implement and train a modified Fourier neural operator as a surrogate solver for electromagnetic scattering problems.
We demonstrate its application to the gradient-based nanophotonic inverse design of free-form, fully three-dimensional electromagnetic scatterers.
arXiv Detail & Related papers (2023-02-04T07:56:18Z)
- Low-Resource Music Genre Classification with Cross-Modal Neural Model Reprogramming [129.4950757742912]
We introduce a novel method for leveraging pre-trained models for low-resource (music) classification, based on the concept of Neural Model Reprogramming (NMR).
NMR aims at re-purposing a pre-trained model from a source domain to a target domain by modifying the input of a frozen pre-trained model.
Experimental results suggest that a neural model pre-trained on large-scale datasets can successfully perform music genre classification by using this reprogramming method.
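Input-side reprogramming of a frozen model, as NMR describes, can be sketched on a toy problem: only an additive input perturbation `delta` is trained, while the pre-trained weights stay fixed. The linear "pre-trained" model and the target-domain data here are hypothetical stand-ins for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pre-trained" linear classifier from a hypothetical source domain.
W = rng.standard_normal((2, 5))

def frozen_model(x):
    return W @ x  # these weights are never updated during reprogramming

# Toy target-domain data: class 0 clusters near -1, class 1 near +1.
X = np.vstack([rng.normal(-1, 0.3, (20, 5)), rng.normal(1, 0.3, (20, 5))])
y = np.array([0] * 20 + [1] * 20)

# Train only the additive input perturbation (the "reprogramming" parameters).
delta = np.zeros(5)
lr = 0.05
for _ in range(200):
    grad = np.zeros(5)
    for xi, yi in zip(X, y):
        logits = frozen_model(xi + delta)
        p = np.exp(logits - logits.max()); p /= p.sum()  # softmax
        err = p.copy(); err[yi] -= 1.0                   # cross-entropy gradient w.r.t. logits
        grad += W.T @ err                                # chain rule into delta
    delta -= lr * grad / len(X)

preds = np.array([np.argmax(frozen_model(xi + delta)) for xi in X])
accuracy = (preds == y).mean()
```

The key point is the parameter count: only the five entries of `delta` are learned, while the frozen model does all the representational work.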
arXiv Detail & Related papers (2022-11-02T17:38:33Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
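One common neural-operator architecture for such function-space maps is the deep operator network, which factors the operator output G(u)(x) into an inner product of a branch net (encoding the input function at fixed sensor locations) and a trunk net (encoding the query point). A minimal sketch with random stand-in weights, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    return w2 @ np.tanh(w1 @ x)  # one-hidden-layer network

p, m, hidden = 16, 32, 24  # basis size, sensor count, hidden width (illustrative)
# Random stand-ins for trained branch/trunk parameters.
bw1, bw2 = rng.standard_normal((hidden, m)), rng.standard_normal((p, hidden))
tw1, tw2 = rng.standard_normal((hidden, 1)), rng.standard_normal((p, hidden))

def deeponet(u_samples, x_query):
    """G(u)(x) ~ <branch(u), trunk(x)>: branch encodes the input function
    at m fixed sensors; trunk encodes the query location."""
    b = mlp(u_samples, bw1, bw2)              # (p,) coefficients from input function
    t = mlp(np.array([x_query]), tw1, tw2)    # (p,) learned basis evaluated at x
    return float(b @ t)

sensors = np.linspace(0, 1, m)
u = np.sin(2 * np.pi * sensors)   # input function sampled at the sensors
value = deeponet(u, 0.5)          # operator output evaluated at x = 0.5
```

Because the trunk net takes the query point as a continuous input, the learned operator can be evaluated anywhere in the domain, not only at training grid points.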
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Controllable reset behavior in domain wall-magnetic tunnel junction artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.