Graph Neural Networks for Temperature-Dependent Activity Coefficient
Prediction of Solutes in Ionic Liquids
- URL: http://arxiv.org/abs/2206.11776v1
- Date: Thu, 23 Jun 2022 15:27:29 GMT
- Title: Graph Neural Networks for Temperature-Dependent Activity Coefficient
Prediction of Solutes in Ionic Liquids
- Authors: Jan G. Rittig, Karim Ben Hicham, Artur M. Schweidtmann, Manuel Dahmen,
Alexander Mitsos
- Abstract summary: We present a GNN to predict temperature-dependent infinite dilution ACs of solutes in ILs.
We train the GNN on a database including more than 40,000 AC values and compare it to a state-of-the-art MCM.
The GNN and MCM achieve similar high prediction performance, with the GNN additionally enabling high-quality predictions for ACs of solutions that contain ILs and solutes not considered during training.
- Score: 58.720142291102135
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Ionic liquids (ILs) are important solvents for sustainable processes and
predicting activity coefficients (ACs) of solutes in ILs is needed. Recently,
matrix completion methods (MCMs), transformers, and graph neural networks
(GNNs) have shown high accuracy in predicting ACs of binary mixtures, superior
to well-established models, e.g., COSMO-RS and UNIFAC. GNNs are particularly
promising here as they learn a molecular graph-to-property relationship without
pretraining, typically required for transformers, and are, unlike MCMs,
applicable to molecules not included in training. For ILs, however, GNN
applications are currently missing. Herein, we present a GNN to predict
temperature-dependent infinite dilution ACs of solutes in ILs. We train the GNN
on a database including more than 40,000 AC values and compare it to a
state-of-the-art MCM. The GNN and MCM achieve similar high prediction
performance, with the GNN additionally enabling high-quality predictions for
ACs of solutions that contain ILs and solutes not considered during training.
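For illustration, a minimal sketch (not the authors' code) of the architecture class the abstract describes: two message-passing encoders embed the solute and the IL as molecular graphs, both graph embeddings are concatenated with the temperature, and an MLP regresses the infinite dilution AC. Layer sizes, mean-neighbor aggregation, and the toy inputs are all illustrative assumptions.

    # Minimal sketch of a temperature-dependent AC predictor; all
    # hyperparameters and the featurization are assumptions, not the paper's.
    import torch
    import torch.nn as nn

    class MeanPassLayer(nn.Module):
        """One round of mean-neighbor message passing on a dense adjacency."""
        def __init__(self, dim):
            super().__init__()
            self.lin = nn.Linear(2 * dim, dim)

        def forward(self, h, adj):                    # h: (n, d), adj: (n, n)
            deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
            msg = adj @ h / deg                       # mean over neighbors
            return torch.relu(self.lin(torch.cat([h, msg], dim=-1)))

    class GraphEncoder(nn.Module):
        def __init__(self, n_feat, dim, n_layers=3):
            super().__init__()
            self.embed = nn.Linear(n_feat, dim)
            self.layers = nn.ModuleList([MeanPassLayer(dim) for _ in range(n_layers)])

        def forward(self, x, adj):
            h = self.embed(x)
            for layer in self.layers:
                h = layer(h, adj)
            return h.mean(dim=0)                      # graph-level readout

    class ACModel(nn.Module):
        def __init__(self, n_feat, dim=64):
            super().__init__()
            self.solute_enc = GraphEncoder(n_feat, dim)
            self.il_enc = GraphEncoder(n_feat, dim)
            self.head = nn.Sequential(nn.Linear(2 * dim + 1, dim),
                                      nn.ReLU(), nn.Linear(dim, 1))

        def forward(self, solute, il, temperature):
            h = torch.cat([self.solute_enc(*solute), self.il_enc(*il),
                           temperature.view(1)])
            return self.head(h)                       # predicted ln(gamma_inf)

    # Toy usage with random atom features and self-loop adjacencies:
    solute = (torch.randn(5, 16), torch.eye(5))
    il = (torch.randn(9, 16), torch.eye(9))
    print(ACModel(n_feat=16)(solute, il, torch.tensor(298.15)))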
Related papers
- Graph Neural Networks for Surfactant Multi-Property Prediction [38.39977540117143]
Graph Neural Networks (GNNs) have exhibited great predictive performance for property prediction of ionic liquids, polymers, and drugs in general.
We create the largest available CMC database with 429 molecules and the first large data collection for surface excess concentration.
The GNN yields highly accurate predictions for CMC, showing great potential for future industrial applications.
arXiv Detail & Related papers (2024-01-03T18:32:25Z)
- ChiENN: Embracing Molecular Chirality with Graph Neural Networks [10.19088492223333]
We propose a theoretically justified message-passing scheme, which makes GNNs sensitive to the order of node neighbors.
We apply that concept in the context of molecular chirality to construct a Chiral Edge Neural Network (ChiENN) layer, which can be appended to any GNN model.
Our experiments show that adding ChiENN layers to a GNN outperforms current state-of-the-art methods in chiral-sensitive molecular property prediction tasks.
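A toy illustration of the idea summarized above (not the actual ChiENN layer): a permutation-invariant sum over neighbors cannot distinguish mirror-image neighbor orderings, while a position-dependent aggregation can. The per-position linear maps are an assumed, simplified stand-in for the paper's scheme.

    import torch
    import torch.nn as nn

    dim = 8
    neighbors = torch.randn(3, dim)          # neighbor embeddings in a fixed order

    # Order-insensitive: the sum is identical for any neighbor permutation.
    invariant = neighbors.sum(dim=0)
    assert torch.allclose(invariant, neighbors.flip(0).sum(dim=0))

    # Order-sensitive: a distinct linear map per neighbor position, so
    # reversing the order (as at a mirrored chiral center) changes the output.
    pos_maps = nn.ModuleList([nn.Linear(dim, dim, bias=False) for _ in range(3)])
    def ordered_agg(nbrs):
        return sum(m(h) for m, h in zip(pos_maps, nbrs))

    out_cw = ordered_agg(neighbors)
    out_ccw = ordered_agg(neighbors.flip(0))
    print(torch.allclose(out_cw, out_ccw))   # False (almost surely)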
arXiv Detail & Related papers (2023-07-05T10:50:40Z)
- Denoise Pretraining on Nonequilibrium Molecules for Accurate and Transferable Neural Potentials [8.048439531116367]
We propose denoise pretraining on nonequilibrium molecular conformations to achieve more accurate and transferable GNN potential predictions.
Our models pretrained on small molecules demonstrate remarkable transferability, improving performance when fine-tuned on diverse molecular systems.
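A minimal sketch of the coordinate-denoising pretext task described above, under its usual formulation: perturb 3D atom positions with Gaussian noise and train the network to recover that noise. The tiny MLP stands in for a GNN potential, and sigma is an assumed hyperparameter.

    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    sigma = 0.1

    pos = torch.randn(20, 3)                 # one (nonequilibrium) conformation
    for _ in range(100):
        noise = sigma * torch.randn_like(pos)
        pred = model(pos + noise)            # predict the added noise per atom
        loss = ((pred - noise) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()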
arXiv Detail & Related papers (2023-03-03T21:15:22Z)
- Gibbs-Helmholtz Graph Neural Network: capturing the temperature dependency of activity coefficients at infinite dilution [1.290382979353427]
We develop the Gibbs-Helmholtz Graph Neural Network (GH-GNN) model for predicting $\ln \gamma_{ij}^{\infty}$ of molecular systems at different temperatures.
We analyze the performance of GH-GNN for continuous and discrete inter/extrapolation and give indications for the model's applicability domain and expected accuracy.
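For context, the thermodynamic relation the model's name alludes to, a standard derivation rather than a quotation from the paper: since $\ln \gamma_{ij}^{\infty} = \bar{G}_{ij}^{E,\infty}/(RT)$, the Gibbs-Helmholtz equation implies

    \frac{\partial \ln \gamma_{ij}^{\infty}}{\partial (1/T)} = \frac{\bar{H}_{ij}^{E,\infty}}{R}
    \quad\Longrightarrow\quad
    \ln \gamma_{ij}^{\infty}(T) \approx A_{ij} + \frac{B_{ij}}{T},

where the linear-in-$1/T$ form assumes the partial molar excess enthalpy $\bar{H}_{ij}^{E,\infty}$ is approximately constant over the temperature range, and $A_{ij}$, $B_{ij}$ are system-specific quantities a GNN can be trained to predict.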
arXiv Detail & Related papers (2022-12-02T14:25:58Z)
- Multi-Task Mixture Density Graph Neural Networks for Predicting Cu-based Single-Atom Alloy Catalysts for CO2 Reduction Reaction [61.9212585617803]
Graph neural networks (GNNs) have drawn increasing attention from materials scientists.
We develop a multi-task (MT) architecture based on DimeNet++ and mixture density networks to improve performance on this task.
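A minimal mixture density network (MDN) head of the kind the summary mentions; the paper pairs one with DimeNet++, while the sizes and the graph embedding here are assumptions. Given an embedding, it outputs a K-component Gaussian mixture over the target instead of a single point estimate.

    import torch
    import torch.nn as nn

    class MDNHead(nn.Module):
        def __init__(self, dim, k=5):
            super().__init__()
            self.out = nn.Linear(dim, 3 * k)     # mixture logits, means, log-stds

        def forward(self, h):
            logits, mu, log_std = self.out(h).chunk(3, dim=-1)
            return logits, mu, log_std.exp()

    def mdn_nll(logits, mu, std, y):
        """Negative log-likelihood of y under the predicted mixture."""
        comp = torch.distributions.Normal(mu, std)
        log_prob = comp.log_prob(y.unsqueeze(-1))            # (batch, k)
        log_mix = torch.log_softmax(logits, dim=-1) + log_prob
        return -torch.logsumexp(log_mix, dim=-1).mean()

    head = MDNHead(dim=64)
    h, y = torch.randn(8, 64), torch.randn(8)
    print(mdn_nll(*head(h), y))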
arXiv Detail & Related papers (2022-09-15T13:52:15Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate their application via two examples of molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- Chemical-Reaction-Aware Molecule Representation Learning [88.79052749877334]
We propose using chemical reactions to assist in learning molecule representations.
Our approach proves effective at (1) keeping the embedding space well-organized and (2) improving the generalization ability of molecule embeddings.
Experimental results demonstrate that our method achieves state-of-the-art performance in a variety of downstream tasks.
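A sketch of the reaction-consistency idea as it is commonly formulated (an assumption about this paper's exact loss): encourage the summed embeddings of a reaction's reactants and products to coincide, while pushing them apart for mismatched (negative) reactions. The margin loss and helper names are hypothetical.

    import torch
    import torch.nn.functional as F

    def reaction_loss(reactant_emb, product_emb, neg_product_emb, margin=1.0):
        """Each argument: (batch, d) sums of per-molecule embeddings."""
        pos = F.mse_loss(reactant_emb, product_emb)          # pull matched pairs together
        neg = (reactant_emb - neg_product_emb).pow(2).sum(-1)
        return pos + F.relu(margin - neg).mean()             # push mismatches apart

    r, p, n = torch.randn(4, 32), torch.randn(4, 32), torch.randn(4, 32)
    print(reaction_loss(r, p, n))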
arXiv Detail & Related papers (2021-09-21T00:08:43Z)
- Improving Molecular Graph Neural Network Explainability with Orthonormalization and Induced Sparsity [0.0]
We propose two simple regularization techniques to apply during the training of GCNNs.
BRO (batch representation orthonormalization) encourages graph convolution operations to generate orthonormal node embeddings.
Gini regularization is applied to the weights of the output layer and constrains the number of dimensions the model can use to make predictions.
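Hedged sketches of the two regularizers the summary names; the paper's exact definitions may differ (e.g., row- vs. column-wise Gram matrix for BRO, or the precise form of the Gini term).

    import torch

    def bro_penalty(node_emb):
        """Penalize deviation of the node-embedding Gram matrix from the
        identity, pushing node embeddings toward orthonormality."""
        gram = node_emb @ node_emb.T                       # (n, n)
        eye = torch.eye(node_emb.shape[0])
        return (gram - eye).pow(2).sum()

    def gini_penalty(weights):
        """Encourage sparse use of output dimensions by penalizing a low
        Gini coefficient of the weight magnitudes."""
        w = weights.abs().flatten()
        diff = (w.unsqueeze(0) - w.unsqueeze(1)).abs().sum()
        gini = diff / (2 * w.numel() ** 2 * w.mean().clamp(min=1e-12))
        return 1.0 - gini                                  # high Gini => low penalty

    x = torch.randn(6, 16)                  # node embeddings of one molecule
    w = torch.randn(16, 1)                  # output-layer weights
    print(bro_penalty(x) + gini_penalty(w))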
arXiv Detail & Related papers (2021-05-11T08:13:34Z)
- Assessing Graph-based Deep Learning Models for Predicting Flash Point [52.931492216239995]
Graph-based deep learning (GBDL) models were applied to flash point prediction for the first time.
The average R^2 and mean absolute error (MAE) of the MPNN are, respectively, 2.3% lower and 2.0 K higher than in previous comparable studies.
arXiv Detail & Related papers (2020-02-26T06:10:12Z)