Gibbs-Helmholtz Graph Neural Network: capturing the temperature
dependency of activity coefficients at infinite dilution
- URL: http://arxiv.org/abs/2212.01199v1
- Date: Fri, 2 Dec 2022 14:25:58 GMT
- Title: Gibbs-Helmholtz Graph Neural Network: capturing the temperature
dependency of activity coefficients at infinite dilution
- Authors: Edgar Ivan Sanchez Medina, Steffen Linke, Martin Stoll, Kai Sundmacher
- Abstract summary: We develop the Gibbs-Helmholtz Graph Neural Network (GH-GNN) model for predicting $\ln \gamma_{ij}^\infty$ of molecular systems at different temperatures.
We analyze the performance of GH-GNN for continuous and discrete inter/extrapolation and give indications for the model's applicability domain and expected accuracy.
- Score: 1.290382979353427
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The accurate prediction of physicochemical properties of chemical compounds
in mixtures (such as the activity coefficient at infinite dilution
$\gamma_{ij}^\infty$) is essential for developing novel and more sustainable
chemical processes. In this work, we analyze the performance of
previously-proposed GNN-based models for the prediction of
$\gamma_{ij}^\infty$, and compare them with several mechanistic models in a
series of 9 isothermal studies. Moreover, we develop the Gibbs-Helmholtz Graph
Neural Network (GH-GNN) model for predicting $\ln \gamma_{ij}^\infty$ of
molecular systems at different temperatures. Our method combines the simplicity
of a Gibbs-Helmholtz-derived expression with a series of graph neural networks
that incorporate explicit molecular and intermolecular descriptors for
capturing dispersion and hydrogen bonding effects. We have trained this model
using experimentally determined $\ln \gamma_{ij}^\infty$ data of 40,219
binary-systems involving 1032 solutes and 866 solvents, overall showing
superior performance compared to the popular UNIFAC-Dortmund model. We analyze
the performance of GH-GNN for continuous and discrete inter/extrapolation and
give indications for the model's applicability domain and expected accuracy. In
general, GH-GNN is able to produce accurate predictions for extrapolated
binary-systems if at least 25 systems with the same combination of
solute-solvent chemical classes are contained in the training set and a
similarity indicator above 0.35 is also present. This model and its
applicability domain recommendations have been made open-source at
https://github.com/edgarsmdn/GH-GNN.
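The abstract's core idea is to let graph neural networks supply the coefficients of a Gibbs-Helmholtz-derived temperature dependence, roughly $\ln \gamma_{ij}^\infty(T) = A_{ij} + B_{ij}/T$. The sketch below is a minimal, hypothetical PyTorch head illustrating that parameterization; the embedding size, MLP layers, and the upstream GNNs assumed to produce the solute/solvent embeddings are illustrative assumptions, not the published GH-GNN architecture (see the linked repository for the actual model).

```python
import torch
import torch.nn as nn

class GHHead(nn.Module):
    """Minimal sketch of a Gibbs-Helmholtz-style prediction head.

    Maps solute and solvent graph embeddings (assumed to come from
    upstream GNN encoders) to two pair-specific parameters A_ij and B_ij,
    then evaluates ln(gamma_ij^inf)(T) = A_ij + B_ij / T.
    Layer sizes are placeholders, not the published GH-GNN architecture.
    """
    def __init__(self, emb_dim: int = 64):
        super().__init__()
        self.mlp_A = nn.Sequential(nn.Linear(2 * emb_dim, 64), nn.ReLU(), nn.Linear(64, 1))
        self.mlp_B = nn.Sequential(nn.Linear(2 * emb_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, h_solute: torch.Tensor, h_solvent: torch.Tensor, T: torch.Tensor) -> torch.Tensor:
        # h_solute, h_solvent: (batch, emb_dim) graph embeddings; T: (batch, 1) temperature in K
        h = torch.cat([h_solute, h_solvent], dim=-1)
        A = self.mlp_A(h)   # temperature-independent contribution
        B = self.mlp_B(h)   # enthalpic contribution, scaled by 1/T
        return A + B / T    # predicted ln(gamma_ij^inf) at temperature T

# Example with random (hypothetical) embeddings:
# head = GHHead(emb_dim=64)
# ln_gamma = head(torch.randn(8, 64), torch.randn(8, 64), torch.full((8, 1), 298.15))
```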
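The applicability-domain recommendation at the end of the abstract (at least 25 training systems with the same solute-solvent chemical-class combination and a similarity indicator above 0.35) can be expressed as a simple check. The function below is an illustrative sketch; the argument names are hypothetical and the exact similarity metric is defined in the paper and repository.

```python
def within_applicability_domain(n_class_matches: int, similarity: float,
                                min_systems: int = 25, min_similarity: float = 0.35) -> bool:
    """Heuristic reliability check following the abstract's recommendation.

    A GH-GNN prediction for an unseen binary system is considered reliable
    when the training set contains at least `min_systems` systems with the
    same solute-solvent chemical-class combination and the similarity
    indicator exceeds `min_similarity`. Names and defaults mirror the
    abstract; see the repository for the exact definition of the indicator.
    """
    return n_class_matches >= min_systems and similarity > min_similarity
```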
Related papers
- Neural P$^3$M: A Long-Range Interaction Modeling Enhancer for Geometric
GNNs [66.98487644676906]
We introduce Neural P$^3$M, a versatile enhancer of geometric GNNs that expands the scope of their capabilities.
It exhibits flexibility across a wide range of molecular systems and demonstrates remarkable accuracy in predicting energies and forces.
It also achieves an average improvement of 22% on the OE62 dataset while integrating with various architectures.
arXiv Detail & Related papers (2024-09-26T08:16:59Z) - Learning CO$_2$ plume migration in faulted reservoirs with Graph Neural
Networks [0.3914676152740142]
We develop a graph-based neural model for capturing the impact of faults on CO$_2$ plume migration.
We demonstrate that our approach can accurately predict the temporal evolution of gas saturation and pore pressure in a synthetic reservoir with faults.
This work highlights the potential of GNN-based methods to accurately and rapidly model subsurface flow with complex faults and fractures.
arXiv Detail & Related papers (2023-06-16T06:47:47Z) - Gibbs-Duhem-Informed Neural Networks for Binary Activity Coefficient
Prediction [45.84205238554709]
We propose Gibbs-Duhem-informed neural networks for the prediction of binary activity coefficients at varying compositions.
We include the Gibbs-Duhem equation explicitly in the loss function for training neural networks.
arXiv Detail & Related papers (2023-05-31T07:36:45Z) - Predicting CO$_2$ Absorption in Ionic Liquids with Molecular Descriptors
and Explainable Graph Neural Networks [9.04563945965023]
Ionic Liquids (ILs) provide a promising solution for CO$_2$ capture and storage to mitigate global warming.
In this work, we develop both fingerprint-based Machine Learning models and Graph Neural Networks (GNNs) to predict CO$_2$ absorption in ILs.
Our method outperforms previous ML models by reaching a high accuracy (MAE of 0.0137, $R^2$ of 0.9884).
arXiv Detail & Related papers (2022-09-29T18:31:12Z) - Efficient Chemical Space Exploration Using Active Learning Based on
Marginalized Graph Kernel: an Application for Predicting the Thermodynamic
Properties of Alkanes with Molecular Simulation [10.339394156446982]
We use molecular dynamics simulations to generate data and a graph neural network (GNN) to make predictions.
Specifically, we target 251,728 alkane molecules consisting of 4 to 19 carbon atoms and their liquid physical properties.
Validation shows that only 313 molecules were sufficient to train an accurate GNN model with $\mathrm{R}^2 > 0.99$ for computational test sets and $\mathrm{R}^2 > 0.94$ for experimental test sets.
arXiv Detail & Related papers (2022-09-01T14:59:13Z) - MolGraph: a Python package for the implementation of molecular graphs
and graph neural networks with TensorFlow and Keras [51.92255321684027]
MolGraph is a graph neural network (GNN) package for molecular machine learning (ML)
MolGraph implements a chemistry module to accommodate the generation of small molecular graphs, which can be passed to a GNN algorithm to solve a molecular ML problem.
GNNs proved useful for molecular identification and improved interpretability of chromatographic retention time data.
arXiv Detail & Related papers (2022-08-21T18:37:41Z) - Graph neural networks for the prediction of molecular structure-property
relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that directly work on the molecular graph.
GNNs allow to learn properties in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z) - Graph Neural Networks for Temperature-Dependent Activity Coefficient
Prediction of Solutes in Ionic Liquids [58.720142291102135]
We present a GNN to predict temperature-dependent infinite dilution ACs of solutes in ILs.
We train the GNN on a database including more than 40,000 AC values and compare it to a state-of-the-art MCM.
The GNN and MCM achieve similar high prediction performance, with the GNN additionally enabling high-quality predictions for ACs of solutions that contain ILs and solutes not considered during training.
arXiv Detail & Related papers (2022-06-23T15:27:29Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)