Physics-constrained Bayesian inference of state functions in classical
density-functional theory
- URL: http://arxiv.org/abs/2010.03374v4
- Date: Wed, 23 Jun 2021 10:28:00 GMT
- Title: Physics-constrained Bayesian inference of state functions in classical
density-functional theory
- Authors: Peter Yatsyshin, Serafim Kalliadasis and Andrew B. Duncan
- Abstract summary: We develop a novel data-driven approach to the inverse problem of classical statistical mechanics.
We develop an efficient learning algorithm which characterises the construction of approximate free energy functionals.
We consider excluded volume particle interactions, which are ubiquitous in nature, whilst being highly challenging for modelling in terms of free energy.
- Score: 0.6445605125467573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a novel data-driven approach to the inverse problem of classical
statistical mechanics: given experimental data on the collective motion of a
classical many-body system, how does one characterise the free energy landscape
of that system? By combining non-parametric Bayesian inference with
physically-motivated constraints, we develop an efficient learning algorithm
which automates the construction of approximate free energy functionals. In
contrast to optimisation-based machine learning approaches, which seek to
minimise a cost function, the central idea of the proposed Bayesian inference
is to propagate a set of prior assumptions through the model, derived from
physical principles. The experimental data is used to probabilistically weigh
the possible model predictions. This naturally leads to humanly interpretable
algorithms with full uncertainty quantification of predictions. In our case,
the output of the learning algorithm is a probability distribution over a
family of free energy functionals, consistent with the observed particle data.
We find that surprisingly small data samples contain sufficient information for
inferring highly accurate analytic expressions of the underlying free energy
functionals, making our algorithm highly data efficient. We consider excluded
volume particle interactions, which are ubiquitous in nature, whilst being
highly challenging for modelling in terms of free energy. To validate our
approach we consider the paradigmatic case of one-dimensional fluid and develop
inference algorithms for the canonical and grand-canonical
statistical-mechanical ensembles. Extensions to higher-dimensional systems are
conceptually straightforward, whilst standard coarse-graining techniques allow
one to easily incorporate attractive interactions.
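The central idea described in the abstract — propagating a set of prior assumptions through the model and using the data to probabilistically weigh the possible predictions — can be sketched with a toy Gaussian-process example. Everything below (the grid, kernel, target function, and noise level) is an illustrative assumption, not the paper's actual algorithm:

```python
# Minimal sketch of Bayesian inference by reweighting prior function samples:
# draw candidate functions from a GP prior, then weight each candidate by the
# likelihood of sparse noisy observations. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Grid on which candidate functions are represented.
x = np.linspace(0.0, 1.0, 50)

# Squared-exponential GP prior over functions on the grid.
ell = 0.2
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
K += 1e-8 * np.eye(len(x))  # jitter for numerical stability
L = np.linalg.cholesky(K)
n_samples = 2000
prior_draws = L @ rng.standard_normal((len(x), n_samples))  # (50, 2000)

# Sparse, noisy observations of an (assumed) underlying function.
true_f = np.sin(2 * np.pi * x)
obs_idx = [5, 20, 35, 45]
sigma = 0.1
y_obs = true_f[obs_idx] + sigma * rng.standard_normal(len(obs_idx))

# Weight each prior draw by its Gaussian likelihood at the observed points.
resid = prior_draws[obs_idx, :] - y_obs[:, None]
log_w = -0.5 * np.sum(resid**2, axis=0) / sigma**2
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Posterior mean and pointwise uncertainty over the whole grid.
post_mean = prior_draws @ w
post_var = (prior_draws - post_mean[:, None]) ** 2 @ w
print(post_mean.shape, post_var.shape)
```

Note how the output is a weighted distribution over functions rather than a single fit: uncertainty quantification comes for free, which is the property the abstract emphasises.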
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
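The estimation task above can be illustrated with a toy one-dimensional example: fitting the drift and diffusion of an Ornstein-Uhlenbeck process from a discretely observed path using simple moment estimators. This is a minimal sketch under stated assumptions, not the adaptive nonparametric method of the cited paper:

```python
# Toy sketch of estimating drift and diffusion of a 1-D SDE
#   dX = theta*(mu - X) dt + sigma dW
# from a discretely observed path. Illustrative moment estimators only.
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma, dt, n = 2.0, 1.0, 0.5, 1e-3, 200_000

# Simulate an Ornstein-Uhlenbeck path with Euler-Maruyama.
x = np.empty(n)
x[0] = 0.0
noise = rng.standard_normal(n - 1)
for t in range(n - 1):
    x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * np.sqrt(dt) * noise[t]

dx = np.diff(x)

# Drift: least-squares fit of E[dX | X] = (a + b*X) dt, so b = -theta.
A = np.column_stack([np.ones(n - 1), x[:-1]])
coef, *_ = np.linalg.lstsq(A * dt, dx, rcond=None)
theta_hat = -coef[1]
mu_hat = coef[0] / theta_hat

# Diffusion: quadratic-variation estimator.
sigma_hat = np.sqrt(np.sum(dx**2) / ((n - 1) * dt))

print(theta_hat, mu_hat, sigma_hat)
```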
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Statistical Mechanics of Dynamical System Identification [3.1484174280822845]
We develop a statistical mechanical approach to analyze sparse equation discovery algorithms.
In this framework, statistical mechanics offers tools to analyze the interplay between complexity and fitness.
arXiv Detail & Related papers (2024-03-04T04:32:28Z) - Unveiling the nonclassicality within quasi-distribution representations through deep learning [1.130790932059036]
A widely adopted approach focuses on the negative values of a quasi-distribution representation as compelling evidence of nonclassicality.
Here we propose a computational approach utilizing a deep generative model, processing three marginals, to construct the joint quasi-distribution functions.
Our approach also provides a significant reduction of the experimental efforts of constructing the Wigner functions of quantum states.
arXiv Detail & Related papers (2023-12-26T13:56:07Z) - Efficient Model-Free Exploration in Low-Rank MDPs [76.87340323826945]
Low-Rank Markov Decision Processes offer a simple, yet expressive framework for RL with function approximation.
Existing algorithms are either (1) computationally intractable, or (2) reliant upon restrictive statistical assumptions.
We propose the first provably sample-efficient algorithm for exploration in Low-Rank MDPs.
arXiv Detail & Related papers (2023-07-08T15:41:48Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - D4FT: A Deep Learning Approach to Kohn-Sham Density Functional Theory [79.50644650795012]
We propose a deep learning approach to solve Kohn-Sham Density Functional Theory (KS-DFT).
We prove that such an approach has the same expressivity as the SCF method, yet reduces the computational complexity.
In addition, we show that our approach enables us to explore more complex neural-based wave functions.
arXiv Detail & Related papers (2023-03-01T10:38:10Z) - A Free Lunch with Influence Functions? Improving Neural Network
Estimates with Concepts from Semiparametric Statistics [41.99023989695363]
We explore the potential for semiparametric theory to be used to improve neural networks and machine learning algorithms.
We propose a new neural network method MultiNet, which seeks the flexibility and diversity of an ensemble using a single architecture.
arXiv Detail & Related papers (2022-02-18T09:35:51Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, yet the precise role of stochasticity in its success remains unclear.
We show that multiplicative noise commonly arises in the parameters due to minibatch variance, leading to heavy-tailed behaviour.
A detailed analysis describes how key factors, including step size and data, shape this behaviour, with consistent results across state-of-the-art neural network models.
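The link between multiplicative noise and heavy tails can be demonstrated with a Kesten-type recursion: iterating w ← a·w + b with light-tailed random factors a and b produces a heavy-tailed stationary distribution whenever occasional large values of a amplify w. All parameters below are illustrative assumptions, not drawn from the cited paper:

```python
# Toy illustration: a linear recursion with multiplicative noise develops
# heavy tails, while the same recursion with a fixed contraction factor
# (purely additive noise) stays Gaussian.
import numpy as np

rng = np.random.default_rng(2)
n_chains, n_steps = 100_000, 400

# Multiplicative case: E[log a] < 0, so the recursion is stable, but
# large draws of a occasionally amplify w, fattening the tails.
w = np.zeros(n_chains)
for _ in range(n_steps):
    a = np.exp(rng.normal(-0.05, 0.3, n_chains))
    b = rng.normal(0.0, 0.1, n_chains)
    w = a * w + b

# Additive-only baseline with the same contraction on average.
v = np.zeros(n_chains)
for _ in range(n_steps):
    b = rng.normal(0.0, 0.1, n_chains)
    v = 0.9 * v + b

def kurt(z):
    """Sample kurtosis; approximately 3 for Gaussian, large for heavy tails."""
    z = z - z.mean()
    return np.mean(z**4) / np.mean(z**2) ** 2

print(kurt(w), kurt(v))
```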
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Physics Informed Deep Kernel Learning [24.033468062984458]
Physics Informed Deep Kernel Learning (PI-DKL) exploits physics knowledge represented by differential equations with latent sources.
For efficient and effective inference, we marginalize out the latent variables and derive a collapsed model evidence lower bound (ELBO).
arXiv Detail & Related papers (2020-06-08T22:43:31Z) - Incorporating physical constraints in a deep probabilistic machine
learning framework for coarse-graining dynamical systems [7.6146285961466]
This paper offers a data-based, probabilistic perspective that enables the quantification of predictive uncertainties.
We formulate the coarse-graining process by employing a probabilistic state-space model.
It is capable of reconstructing the evolution of the full, fine-scale system.
arXiv Detail & Related papers (2019-12-30T16:07:46Z)
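The probabilistic state-space idea can be sketched with the simplest such model: a linear-Gaussian system in which a Kalman filter reconstructs the fine-scale state, with uncertainty, from coarse noisy observations. This toy example (all parameters assumed) is not the cited paper's model:

```python
# Minimal linear-Gaussian state-space sketch: filter noisy observations y_t
# of a fine-scale state x_t to recover a posterior mean and variance.
import numpy as np

rng = np.random.default_rng(3)
n = 500
a, q, r = 0.95, 0.1, 0.5  # transition, process noise var, observation noise var

# Simulate the fine-scale state and coarse noisy observations.
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.standard_normal()
y = x + np.sqrt(r) * rng.standard_normal(n)

# Kalman filter: posterior mean m and variance p of x_t given y_1..y_t.
m = np.zeros(n)
p = np.zeros(n)
m_t, p_t = 0.0, 1.0
for t in range(n):
    if t > 0:  # predict step
        m_t, p_t = a * m_t, a * a * p_t + q
    k = p_t / (p_t + r)  # update step with observation y_t
    m_t = m_t + k * (y[t] - m_t)
    p_t = (1 - k) * p_t
    m[t], p[t] = m_t, p_t

# The filtered estimate should beat the raw observations on average.
err_filt = np.mean((m - x) ** 2)
err_obs = np.mean((y - x) ** 2)
print(err_filt, err_obs)
```

The per-step posterior variance `p` is the uncertainty quantification analogue of what the abstract describes: the reconstruction comes with an error bar, not just a point estimate.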
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.