Capturing dynamical correlations using implicit neural representations
- URL: http://arxiv.org/abs/2304.03949v1
- Date: Sat, 8 Apr 2023 07:55:36 GMT
- Title: Capturing dynamical correlations using implicit neural representations
- Authors: Sathya Chitturi, Zhurun Ji, Alexander Petsch, Cheng Peng, Zhantao
Chen, Rajan Plumley, Mike Dunne, Sougata Mardanya, Sugata Chowdhury, Hongwei
Chen, Arun Bansil, Adrian Feiguin, Alexander Kolesnikov, Dharmalingam
Prabhakaran, Stephen Hayden, Daniel Ratner, Chunjing Jia, Youssef Nashed,
Joshua Turner
- Abstract summary: We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
- Score: 85.66456606776552
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The observation and description of collective excitations in solids is a
fundamental issue when seeking to understand the physics of a many-body system.
Analysis of these excitations is usually carried out by measuring the dynamical
structure factor, S(Q, $\omega$), with inelastic neutron or x-ray scattering
techniques and comparing this against a calculated dynamical model. Here, we
develop an artificial intelligence framework which combines a neural network
trained to mimic simulated data from a model Hamiltonian with automatic
differentiation to recover unknown parameters from experimental data. We
benchmark this approach on a Linear Spin Wave Theory (LSWT) simulator and
advanced inelastic neutron scattering data from the square-lattice spin-1
antiferromagnet La$_2$NiO$_4$. We find that the model predicts the unknown
parameters with excellent agreement relative to analytical fitting. In doing
so, we illustrate the ability to build and train a differentiable model only
once, which then can be applied in real-time to multi-dimensional scattering
data, without the need for human-guided peak finding and fitting algorithms.
This prototypical approach promises a new technology for this field to
automatically detect and refine more advanced models for ordered quantum
systems.
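In outline, the framework amounts to a two-stage procedure: first train a neural surrogate that maps momentum transfer Q, energy transfer $\omega$, and candidate Hamiltonian parameters to the simulated intensity S(Q, $\omega$); then freeze that surrogate and use automatic differentiation to optimize the unknown parameters against measured data. The PyTorch sketch below illustrates the idea only; the toy dispersion standing in for the LSWT simulator, the network size, the parameter names, and the mean-squared-error loss are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): train a neural surrogate of a
# simulator mapping (Q, omega, Hamiltonian parameters) -> S(Q, omega),
# then freeze it and recover the parameters from "measured" data by
# gradient descent. The toy dispersion below stands in for an LSWT
# simulator; all names, shapes, and settings are illustrative assumptions.
import torch
import torch.nn as nn

def toy_simulator(q, omega, params, width=0.05):
    """Stand-in for LSWT: a Gaussian ridge along a parameter-dependent dispersion."""
    j, gap = params[0], params[1]
    disp = gap + j * torch.abs(torch.sin(torch.pi * q)).squeeze(-1)
    return torch.exp(-((omega.squeeze(-1) - disp) ** 2) / (2 * width ** 2))

class Surrogate(nn.Module):
    """Maps (Q, omega, Hamiltonian parameters) to a predicted intensity."""
    def __init__(self, n_params=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 + n_params, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )
    def forward(self, q, omega, params):
        x = torch.cat([q, omega, params.expand(q.shape[0], -1)], dim=-1)
        return self.net(x).squeeze(-1)

# Stage 1: train the surrogate on simulated data over a range of parameters.
surrogate = Surrogate()
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for step in range(3000):
    q = torch.rand(256, 1)                         # momentum transfer (1D toy)
    omega = 3.0 * torch.rand(256, 1)               # energy transfer
    params = 0.5 + 2.0 * torch.rand(2)             # e.g. exchange coupling and gap (assumed)
    s_sim = toy_simulator(q, omega, params)
    loss = nn.functional.mse_loss(surrogate(q, omega, params), s_sim)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: freeze the surrogate; recover unknown parameters from "experimental" data.
q_exp, w_exp = torch.rand(2048, 1), 3.0 * torch.rand(2048, 1)
s_exp = toy_simulator(q_exp, w_exp, torch.tensor([1.7, 0.9]))   # hidden ground truth
for p in surrogate.parameters():
    p.requires_grad_(False)
theta = torch.tensor([1.0, 0.5], requires_grad=True)
fit = torch.optim.Adam([theta], lr=1e-2)
for step in range(2000):
    loss = nn.functional.mse_loss(surrogate(q_exp, w_exp, theta), s_exp)
    fit.zero_grad(); loss.backward(); fit.step()
print("recovered parameters:", theta.detach().tolist())
```

Because the surrogate is differentiable with respect to the Hamiltonian parameters, the second stage is ordinary gradient-based fitting: once the network is trained it can be reused on new multi-dimensional scattering data without retraining, which is the real-time refinement the abstract describes.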
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Inferring stochastic low-rank recurrent neural networks from neural data [5.179844449042386]
A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system.
Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics.
Here, we propose to fit low-rank RNNs with variational sequential Monte Carlo methods.
arXiv Detail & Related papers (2024-06-24T15:57:49Z) - HyperSINDy: Deep Generative Modeling of Nonlinear Stochastic Governing
Equations [5.279268784803583]
We introduce HyperSINDy, a framework for modeling dynamics via a deep generative model of sparse governing equations from data.
Once trained, HyperSINDy generates dynamics via a differential equation whose coefficients are driven by white noise.
In experiments, HyperSINDy recovers ground truth governing equations, with learned stochasticity scaling to match that of the data.
arXiv Detail & Related papers (2023-10-07T14:41:59Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate such models.
arXiv Detail & Related papers (2022-02-17T07:56:46Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Uncertainty quantification in a mechanical submodel driven by a
Wasserstein-GAN [0.0]
We show that the use of non-linear techniques in machine learning and data-driven methods is highly relevant.
Generative Adversarial Networks (GANs) are suited for such applications, where the Wasserstein-GAN with gradient penalty variant offers improved results.
arXiv Detail & Related papers (2021-10-26T13:18:06Z) - Extracting Governing Laws from Sample Path Data of Non-Gaussian
Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian Lévy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric Lévy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
arXiv Detail & Related papers (2021-07-21T14:50:36Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data are noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z) - Combining data assimilation and machine learning to emulate a dynamical
model from sparse and noisy observations: a case study with the Lorenz 96
model [0.0]
The method iteratively applies a data assimilation step, here an ensemble Kalman filter, followed by a neural network update (a minimal sketch of this loop appears below).
Data assimilation is used to optimally combine a surrogate model with sparse data.
The resulting analysis is spatially complete and serves as a training set for the neural network, which updates the surrogate model.
Numerical experiments with the chaotic 40-variable Lorenz 96 model demonstrate both the convergence and the statistical skill of the proposed hybrid approach.
arXiv Detail & Related papers (2020-01-06T12:26:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.