Using Restricted Boltzmann Machines to Model Molecular Geometries
- URL: http://arxiv.org/abs/2012.06984v1
- Date: Sun, 13 Dec 2020 07:02:32 GMT
- Title: Using Restricted Boltzmann Machines to Model Molecular Geometries
- Authors: Peter Nekrasov, Jessica Freeze, and Victor Batista
- Abstract summary: This paper proposes a new methodology for modeling a set of physical parameters by taking advantage of the restricted Boltzmann machine's fast learning capacity and representational power.
In this paper we introduce a new RBM based on the Tanh activation function, and conduct a comparison of RBMs with different activation functions.
We demonstrate the ability of Gaussian RBMs to model small molecules such as water and ethane.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Precise physical descriptions of molecules can be obtained by solving the
Schrödinger equation; however, these calculations are intractable and even
approximations can be cumbersome. Force fields, which estimate interatomic
potentials based on empirical data, are also time-consuming. This paper
proposes a new methodology for modeling a set of physical parameters by taking
advantage of the restricted Boltzmann machine's fast learning capacity and
representational power. By training the machine on ab initio data, we can
predict new data in the distribution of molecular configurations matching the
ab initio distribution. In this paper we introduce a new RBM based on the Tanh
activation function, and conduct a comparison of RBMs with different activation
functions, including sigmoid, Gaussian, and (Leaky) ReLU. Finally we
demonstrate the ability of Gaussian RBMs to model small molecules such as water
and ethane.
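The Gaussian RBM described in the abstract, trained on ab initio geometry samples, can be sketched roughly as follows. This is an illustrative toy implementation, not the authors' code: the class name, the CD-1 training loop, and the synthetic 3-atom "geometry" data are all assumptions made for this example (real-valued Gaussian visible units with unit variance, Bernoulli hidden units).

```python
import numpy as np

rng = np.random.default_rng(0)

class GaussianBernoulliRBM:
    """Gaussian visible units (real-valued coordinates), Bernoulli hidden units."""
    def __init__(self, n_visible, n_hidden, lr=0.01):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible (coordinate) biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def _sigmoid(self, x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        return self._sigmoid(v @ self.W + self.b_h)

    def visible_mean(self, h):
        # With unit-variance Gaussian visibles, the conditional mean is linear in h
        return h @ self.W.T + self.b_v

    def cd1_step(self, v0):
        """One contrastive-divergence (CD-1) update on a batch of geometries."""
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_mean(h0_sample)          # mean-field reconstruction
        h1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)
        return float(np.mean((v0 - v1) ** 2))      # reconstruction error

# Toy stand-in for ab initio data: flattened (x, y, z) coordinates of a
# 3-atom molecule, sampled around a single equilibrium structure.
equilibrium = rng.normal(size=9)
data = equilibrium + 0.05 * rng.normal(size=(256, 9))

rbm = GaussianBernoulliRBM(n_visible=9, n_hidden=16)
errors = [rbm.cd1_step(data) for _ in range(200)]
```

After training, new configurations in the learned distribution could be drawn by Gibbs sampling (alternating `hidden_probs` and `visible_mean`) from a random hidden state; swapping the sigmoid for a Tanh or (Leaky) ReLU hidden activation gives the variants the paper compares.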
Related papers
- MING: A Functional Approach to Learning Molecular Generative Models [46.189683355768736]
This paper introduces a novel paradigm for learning molecule generative models based on functional representations.
We propose Molecular Implicit Neural Generation (MING), a diffusion-based model that learns molecular distributions in function space.
arXiv Detail & Related papers (2024-10-16T13:02:02Z)
- Nutmeg and SPICE: Models and Data for Biomolecular Machine Learning [1.747623282473278]
SPICE dataset is a collection of quantum chemistry calculations for training machine learning potentials.
We train a set of potential energy functions called Nutmeg on it.
arXiv Detail & Related papers (2024-06-18T23:54:21Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Molecule Design by Latent Space Energy-Based Modeling and Gradual Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performances on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Exploring Chemical Space with Score-based Out-of-distribution Generation [57.15855198512551]
We propose MOOD, a score-based diffusion scheme that incorporates out-of-distribution control into the generative stochastic differential equation (SDE).
Since some novel molecules may not meet the basic requirements of real-world drugs, MOOD performs conditional generation by utilizing the gradients from a property predictor.
We experimentally validate that MOOD is able to explore the chemical space beyond the training distribution, generating molecules that outscore ones found with existing methods, and even the top 0.01% of the original training pool.
arXiv Detail & Related papers (2022-06-06T06:17:11Z)
- Learning black- and gray-box chemotactic PDEs/closures from agent based Monte Carlo simulation data [0.6882042556551611]
We propose a machine learning framework for the data-driven discovery of macroscopic chemotactic Partial Differential Equations (PDEs).
The fine scale, detailed, hybrid (continuum - Monte Carlo) simulation model embodies the underlying biophysics.
We learn effective, coarse-grained "Keller-Segel class" chemotactic PDEs using machine learning regressors.
arXiv Detail & Related papers (2022-05-26T03:02:49Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- End-to-End Differentiable Molecular Mechanics Force Field Construction [0.5269923665485903]
We propose an alternative approach that uses graph neural networks to perceive chemical environments.
The entire process is modular and end-to-end differentiable with respect to model parameters.
We show that this approach is not only sufficient to reproduce legacy atom types, but that it can also learn to accurately reproduce and extend existing molecular mechanics force fields.
arXiv Detail & Related papers (2020-10-02T20:59:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.