Prediction of transport property via machine learning molecular
movements
- URL: http://arxiv.org/abs/2203.03103v1
- Date: Mon, 7 Mar 2022 02:28:07 GMT
- Title: Prediction of transport property via machine learning molecular
movements
- Authors: Ikki Yasuda, Yusei Kobayashi, Katsuhiro Endo, Yoshihiro Hayakawa,
Kazuhiko Fujiwara, Kuniaki Yajima, Noriyoshi Arai, Kenji Yasuoka
- Abstract summary: We present a simple supervised machine learning method to predict the transport properties of materials.
This method was applied to predict the viscosity of lubricant molecules in confinement with shear flow.
We revealed two types of molecular mechanisms that contribute to low viscosity.
- Score: 1.0554048699217666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Molecular dynamics (MD) simulations are increasingly being combined with
machine learning (ML) to predict material properties. The molecular
configurations obtained from MD are represented by multiple features, such as
thermodynamic properties, and are used as the ML input. However, to accurately
learn the input-output patterns, ML requires a sufficiently large dataset, the
size of which depends on the complexity of the ML model. Generating such a large
dataset from MD simulations is impractical because of their high computational
cost. In this
study, we present a simple supervised ML method to predict the transport
properties of materials. To simplify the model, an unsupervised ML method
obtains an efficient representation of molecular movements. This method was
applied to predict the viscosity of lubricant molecules in confinement with
shear flow. Furthermore, the simplicity of the model facilitates its
interpretation, which helps clarify the molecular mechanisms of viscosity. We
revealed two types of molecular mechanisms that contribute to low viscosity.
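The pipeline described in the abstract can be illustrated with a minimal sketch: an unsupervised step compresses per-run molecular-movement features, and a small supervised regressor maps that compact representation to viscosity. PCA and a random forest are stand-ins chosen for illustration, and the data below are synthetic placeholders; this is not the authors' actual pipeline.

```python
# Minimal sketch: compress per-run molecular-movement features with an unsupervised
# method (PCA, assumed here), then fit a small supervised regressor mapping the
# compressed representation to a transport property (viscosity).
# All arrays are synthetic placeholders, not data from the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_runs, n_frames = 200, 500                              # assumed numbers of MD runs / frames
movements = rng.normal(size=(n_runs, n_frames))          # e.g. per-frame molecular displacements
viscosity = movements.std(axis=1) * 3.0 + rng.normal(scale=0.05, size=n_runs)  # toy targets

# Unsupervised step: low-dimensional representation of the molecular movements.
pca = PCA(n_components=8)
features = pca.fit_transform(movements)

# Supervised step: simple model from the compact representation to viscosity.
X_train, X_test, y_train, y_test = train_test_split(features, viscosity, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out runs:", model.score(X_test, y_test))
```

Keeping the representation small (a handful of components) is what keeps the supervised model simple enough to train on few MD runs and to interpret afterwards.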
Related papers
- Pre-trained Molecular Language Models with Random Functional Group Masking [54.900360309677794]
We propose a SMILES-based Molecular Language Model that randomly masks SMILES subsequences corresponding to specific molecular atoms.
This technique aims to compel the model to better infer molecular structures and properties, thus enhancing its predictive capabilities.
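As a rough illustration of the masking idea summarized in this entry, the sketch below randomly replaces SMILES tokens with a [MASK] symbol. The tokenizer, mask rate, and mask symbol are assumptions for illustration, not the paper's actual implementation.

```python
# Hedged sketch of random SMILES-subsequence masking for masked-language-model
# pre-training. Tokenization and the 15% mask rate are illustrative assumptions.
import random
import re

SMILES_TOKEN = re.compile(r"Cl|Br|\[[^\]]+\]|.")   # crude tokenizer: two-letter atoms, bracket atoms, single chars

def mask_smiles(smiles: str, mask_rate: float = 0.15, seed: int = 0) -> str:
    """Replace a random subset of SMILES tokens with a [MASK] symbol."""
    rng = random.Random(seed)
    tokens = SMILES_TOKEN.findall(smiles)
    masked = ["[MASK]" if rng.random() < mask_rate else tok for tok in tokens]
    return "".join(masked)

print(mask_smiles("CC(=O)Oc1ccccc1C(=O)O", mask_rate=0.2, seed=42))  # aspirin, partially masked
```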
arXiv Detail & Related papers (2024-11-03T01:56:15Z)
- Unveiling Molecular Secrets: An LLM-Augmented Linear Model for Explainable and Calibratable Molecular Property Prediction [26.25787628872043]
This work proposes a novel framework, called MoleX, to build a simple yet powerful linear model for accurate molecular property prediction.
The core of MoleX is to model complicated molecular structure-property relationships using a simple linear model, augmented by LLM knowledge and a crafted calibration strategy.
Extensive experiments demonstrate that MoleX outperforms existing methods in molecular property prediction, establishing a new milestone in predictive performance, explainability, and efficiency.
arXiv Detail & Related papers (2024-10-11T14:07:57Z)
- Multi-task learning for molecular electronic structure approaching coupled-cluster accuracy [9.81014501502049]
We develop a unified machine learning method for electronic structures of organic molecules using the gold-standard CCSD(T) calculations as training data.
Tested on hydrocarbon molecules, our model outperforms DFT with widely used hybrid and double-hybrid functionals in both computational cost and prediction accuracy for various quantum chemical properties.
arXiv Detail & Related papers (2024-05-09T19:51:27Z)
- Molecule Design by Latent Prompt Transformer [76.2112075557233]
This work explores the challenging problem of molecule design by framing it as a conditional generative modeling task.
We propose a novel generative model comprising three components: (1) a latent vector with a learnable prior distribution; (2) a molecule generation model based on a causal Transformer, which uses the latent vector as a prompt; and (3) a property prediction model that predicts a molecule's target properties and/or constraint values using the latent prompt.
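The three components listed in this entry could be wired together roughly as in the following PyTorch sketch: a latent vector with a learnable prior transform, a causal Transformer that consumes the latent as a prompt token, and a property head on the latent. Dimensions, vocabulary, and the prior parameterization are illustrative assumptions and do not reproduce the paper's exact architecture.

```python
# Schematic sketch of a latent-prompt generative model (illustrative, not the paper's design).
import torch
import torch.nn as nn

class LatentPromptTransformer(nn.Module):
    def __init__(self, vocab_size=64, d_model=128, latent_dim=16, n_layers=2, n_heads=4):
        super().__init__()
        self.latent_dim = latent_dim
        self.prior_net = nn.Sequential(nn.Linear(latent_dim, latent_dim), nn.Tanh())  # learnable prior transform
        self.latent_to_prompt = nn.Linear(latent_dim, d_model)
        self.token_emb = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)
        self.property_head = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, tokens, z=None):
        if z is None:  # draw z through the learnable prior transform
            z = self.prior_net(torch.randn(tokens.size(0), self.latent_dim, device=tokens.device))
        prompt = self.latent_to_prompt(z).unsqueeze(1)            # latent vector used as a prompt token
        x = torch.cat([prompt, self.token_emb(tokens)], dim=1)
        mask = nn.Transformer.generate_square_subsequent_mask(x.size(1))  # causal attention
        h = self.decoder(x, mask=mask)
        logits = self.lm_head(h[:, 1:])                           # next-token logits for molecule tokens
        prop = self.property_head(z)                              # property prediction from the latent
        return logits, prop

model = LatentPromptTransformer()
tokens = torch.randint(0, 64, (2, 10))        # a toy batch of two token sequences
logits, prop = model(tokens)
print(logits.shape, prop.shape)               # (2, 10, 64) and (2, 1)
```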
arXiv Detail & Related papers (2024-02-27T03:33:23Z)
- Molecule Design by Latent Space Energy-Based Modeling and Gradual Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performance on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z)
- Implicit Geometry and Interaction Embeddings Improve Few-Shot Molecular Property Prediction [53.06671763877109]
We develop molecular embeddings that encode complex molecular characteristics to improve the performance of few-shot molecular property prediction.
Our approach leverages large amounts of synthetic data, namely the results of molecular docking calculations.
On multiple molecular property prediction benchmarks, training from the embedding space substantially improves Multi-Task, MAML, and Prototypical Network few-shot learning performance.
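One of the few-shot methods mentioned in this entry, the Prototypical Network, reduces to nearest-prototype classification over precomputed embeddings. The sketch below uses random placeholder embeddings and an assumed 32-dimensional embedding space; it illustrates the few-shot step, not the docking-derived embeddings themselves.

```python
# Prototypical-Network-style few-shot classification over precomputed molecular
# embeddings. Embeddings and labels here are random placeholders.
import numpy as np

def prototypical_predict(support_emb, support_labels, query_emb):
    """Assign each query to the class whose support-set prototype (mean embedding) is nearest."""
    classes = np.unique(support_labels)
    prototypes = np.stack([support_emb[support_labels == c].mean(axis=0) for c in classes])
    dists = ((query_emb[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # squared Euclidean distance
    return classes[dists.argmin(axis=1)]

rng = np.random.default_rng(1)
support = rng.normal(size=(10, 32))      # 10 labelled molecules, 32-d embeddings (assumed size)
labels = np.array([0] * 5 + [1] * 5)     # binary property, 5-shot per class
queries = rng.normal(size=(4, 32))
print(prototypical_predict(support, labels, queries))
```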
arXiv Detail & Related papers (2023-02-04T01:32:40Z)
- Accurate Machine Learned Quantum-Mechanical Force Fields for Biomolecular Simulations [51.68332623405432]
Molecular dynamics (MD) simulations allow atomistic insights into chemical and biological processes.
Recently, machine learned force fields (MLFFs) emerged as an alternative means to execute MD simulations.
This work proposes a general approach to constructing accurate MLFFs for large-scale molecular simulations.
arXiv Detail & Related papers (2022-05-17T13:08:28Z)
- Designing Machine Learning Surrogates using Outputs of Molecular Dynamics Simulations as Soft Labels [0.0]
We show that statistical uncertainties associated with the outputs of molecular dynamics simulations can be utilized to train artificial neural networks.
We design soft labels for the simulation outputs by incorporating the uncertainties in the estimated average output quantities.
The approach is illustrated with the design of a surrogate for molecular dynamics simulations of confined electrolytes.
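One plausible reading of the soft-label idea in this entry is to resample training targets from a Gaussian centered on each simulation's estimated mean output, with its statistical uncertainty as the spread, so that noisier simulations contribute fuzzier labels. The sketch below illustrates that assumption and is not necessarily the paper's exact construction.

```python
# Hedged sketch: expand each (mean, uncertainty) simulation output into several
# resampled "soft" training targets. The numbers are placeholders.
import numpy as np

def soft_labels(mean_outputs, std_errors, n_draws=20, seed=0):
    """Expand each (mean, uncertainty) pair into n_draws resampled soft targets."""
    rng = np.random.default_rng(seed)
    means = np.repeat(mean_outputs, n_draws)
    stds = np.repeat(std_errors, n_draws)
    return rng.normal(loc=means, scale=stds)

mean_outputs = np.array([1.20, 0.85, 2.10])   # averaged simulation observables (placeholders)
std_errors = np.array([0.05, 0.20, 0.10])     # their statistical uncertainties
targets = soft_labels(mean_outputs, std_errors)
print(targets.shape)   # (60,); inputs would be repeated to match before neural-network training
```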
arXiv Detail & Related papers (2021-10-27T19:00:40Z)
- Do Large Scale Molecular Language Representations Capture Important Structural Information? [31.76876206167457]
We present molecular embeddings obtained by training an efficient transformer encoder model, referred to as MoLFormer.
Experiments show that the learned molecular representation performs competitively when compared to graph-based and fingerprint-based supervised learning baselines.
arXiv Detail & Related papers (2021-06-17T14:33:55Z)
- BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.