SpinMultiNet: Neural Network Potential Incorporating Spin Degrees of Freedom with Multi-Task Learning
- URL: http://arxiv.org/abs/2409.03253v2
- Date: Sun, 8 Sep 2024 23:58:44 GMT
- Title: SpinMultiNet: Neural Network Potential Incorporating Spin Degrees of Freedom with Multi-Task Learning
- Authors: Koki Ueno, Satoru Ohuchi, Kazuhide Ichikawa, Kei Amii, Kensuke Wakasugi,
- Abstract summary: This study introduces SpinMultiNet, a novel NNP model that integrates spin degrees of freedom through multi-task learning.
SpinMultiNet achieves accurate predictions without relying on correct spin values obtained from DFT calculations.
Validation on a dataset of transition metal oxides demonstrates the high predictive accuracy of SpinMultiNet.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Network Potentials (NNPs) have attracted significant attention as a method for accelerating density functional theory (DFT) calculations. However, conventional NNP models typically do not incorporate spin degrees of freedom, limiting their applicability to systems where spin states critically influence material properties, such as transition metal oxides. This study introduces SpinMultiNet, a novel NNP model that integrates spin degrees of freedom through multi-task learning. SpinMultiNet achieves accurate predictions without relying on correct spin values obtained from DFT calculations. Instead, it utilizes initial spin estimates as input and leverages multi-task learning to optimize the spin latent representation while maintaining both $E(3)$ and time-reversal equivariance. Validation on a dataset of transition metal oxides demonstrates the high predictive accuracy of SpinMultiNet. The model successfully reproduces the energy ordering of stable spin configurations originating from superexchange interactions and accurately captures the rhombohedral distortion of the rocksalt structure. These results pave the way for new possibilities in materials simulations that consider spin degrees of freedom, promising future applications in large-scale simulations of various material systems, including magnetic materials.
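The multi-task setup described in the abstract — an energy target plus an auxiliary spin target sharing one latent representation — can be sketched as a weighted sum of per-task losses. The function name, weights, and toy data below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def multitask_loss(pred_energy, true_energy, pred_spin, true_spin,
                   w_energy=1.0, w_spin=0.1):
    """Weighted multi-task loss: the auxiliary spin head regularizes the
    shared latent representation (the weights here are illustrative)."""
    loss_energy = np.mean((pred_energy - true_energy) ** 2)
    loss_spin = np.mean((pred_spin - true_spin) ** 2)
    return w_energy * loss_energy + w_spin * loss_spin

# Toy batch: 4 structures with scalar energies and 2 spin moments each.
pred_e = np.array([1.0, 2.0, 3.0, 4.0])
true_e = np.array([1.1, 1.9, 3.2, 3.8])
pred_s = np.array([[2.0, -2.0], [1.8, -1.9], [2.1, -2.0], [2.0, -2.1]])
true_s = np.array([[2.0, -2.0], [2.0, -2.0], [2.0, -2.0], [2.0, -2.0]])
print(multitask_loss(pred_e, true_e, pred_s, true_s))
```

Training on the joint objective is what lets the model refine initial spin estimates internally instead of requiring converged DFT spin values as input.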
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- Neural Networks as Spin Models: From Glass to Hidden Order Through Training [0.0]
We explore a one-to-one correspondence between a neural network (NN) and a statistical mechanical spin model.
We study the magnetic phases and the melting transition temperature as training progresses.
arXiv Detail & Related papers (2024-08-12T18:01:04Z)
- Microscopic understanding of NMR signals by dynamic mean-field theory for spins [0.0]
A recently developed dynamic mean-field theory for disordered spins (spinDMFT) is shown to capture the spin dynamics of nuclear spins very well.
We dub this versatile approach non-local spinDMFT (nl-spinDMFT).
arXiv Detail & Related papers (2024-03-15T16:54:46Z) - Guiding Diamond Spin Qubit Growth with Computational Methods [14.693424479293737]
We show the use of theoretical calculations of electronic central spin decoherence as an integral part of an NV-spin bath workflow.
We then build a maximum likelihood estimator with our theoretical model, enabling the characterization of a test sample.
arXiv Detail & Related papers (2023-08-17T15:53:42Z) - Neural Functional Transformers [99.98750156515437]
This paper uses the attention mechanism to define a novel set of permutation-equivariant weight-space layers called neural functional Transformers (NFTs).
NFTs respect weight-space permutation symmetries while incorporating the advantages of attention, which have exhibited remarkable success across multiple domains.
We also leverage NFTs to develop Inr2Array, a novel method for computing permutation-invariant representations from the weights of implicit neural representations (INRs).
arXiv Detail & Related papers (2023-05-22T23:38:27Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
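A minimal, framework-free illustration of that idea — recover an unknown model parameter by descending the gradient of its mismatch to data — might look like this. The linear observable and the hand-derived gradient are stand-ins for the paper's neural surrogate and automatic differentiation.

```python
import numpy as np

# Synthetic "experiment": observable y = J * x with true coupling J = 1.5.
x = np.linspace(0.0, 1.0, 50)
y_obs = 1.5 * x

def loss_grad(J):
    """Gradient of the mean-squared mismatch with respect to the coupling J
    (written by hand here; autodiff would supply it in the real workflow)."""
    residual = J * x - y_obs
    return 2.0 * np.mean(residual * x)

J = 0.0  # initial guess
for _ in range(500):
    J -= 0.5 * loss_grad(J)

print(J)  # converges toward the true coupling 1.5
```

Because the surrogate model only needs to be trained once, the same fitting loop can then be rerun cheaply against each new experimental dataset.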
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - General time-reversal equivariant neural network potential for magnetic
materials [5.334610924852583]
This study introduces a time-reversal E(3)-equivariant neural network and the SpinGNN++ framework for constructing a comprehensive interatomic potential for magnetic systems.
SpinGNN++ integrates a spin-equivariant neural network with explicit spin-lattice terms, including Heisenberg, Dzyaloshinskii-Moriya, Kitaev, single-ion anisotropy, and biquadratic interactions.
SpinGNN++ identifies a new ferrimagnetic state as the ground magnetic state for monolayer CrTe2.
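For context, several of the explicit terms named above have simple classical forms. This sketch evaluates a Heisenberg exchange, a Dzyaloshinskii-Moriya term, and single-ion anisotropy for a single spin pair; all coupling values and the sign convention are made-up illustrations, not parameters from the paper.

```python
import numpy as np

def pair_energy(s_i, s_j, J=1.0, D=(0.0, 0.0, 0.2), K=0.1, axis=(0.0, 0.0, 1.0)):
    """Classical spin-pair energy with three of the terms listed above:
    Heisenberg exchange       J * (s_i . s_j)
    Dzyaloshinskii-Moriya     D . (s_i x s_j)
    single-ion anisotropy    -K * [(s_i . n)^2 + (s_j . n)^2]
    Couplings J, D, K and the easy axis n are illustrative values."""
    s_i, s_j = np.asarray(s_i, float), np.asarray(s_j, float)
    n = np.asarray(axis, float)
    e_heis = J * np.dot(s_i, s_j)
    e_dm = np.dot(D, np.cross(s_i, s_j))
    e_ani = -K * (np.dot(s_i, n) ** 2 + np.dot(s_j, n) ** 2)
    return e_heis + e_dm + e_ani

# Antiparallel spins along z: with this sign convention, J > 0 favors them.
print(pair_energy([0, 0, 1], [0, 0, -1]))
```

A learned potential such as SpinGNN++ adds a neural correction on top of explicit terms like these, so the network only has to capture what the analytic forms miss.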
arXiv Detail & Related papers (2022-11-21T12:25:58Z) - Momentum Diminishes the Effect of Spectral Bias in Physics-Informed
Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z) - Spin-Dependent Graph Neural Network Potential for Magnetic Materials [5.775111970429336]
SpinGNN is a spin-dependent interatomic potential approach that employs graph neural networks (GNNs) to describe magnetic systems.
The effectiveness of SpinGNN is demonstrated by its exceptional precision in fitting a high-order spin Hamiltonian and two complex spin-lattice Hamiltonians.
arXiv Detail & Related papers (2022-03-06T01:54:50Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP (McCulloch-Pitts) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.