A machine-learned expression for the excess Gibbs energy
- URL: http://arxiv.org/abs/2509.06484v1
- Date: Mon, 08 Sep 2025 09:47:03 GMT
- Title: A machine-learned expression for the excess Gibbs energy
- Authors: Marco Hoffmann, Thomas Specht, Quirin Göttl, Jakob Burger, Stephan Mandt, Hans Hasse, Fabian Jirasek
- Abstract summary: The excess Gibbs energy plays a central role in chemical engineering and chemistry, providing a basis for modeling the thermodynamic properties of liquid mixtures. In this work, we address this challenge by integrating physical laws as hard constraints within a flexible neural network. The resulting model, HANNA, was trained end-to-end on an extensive experimental dataset for binary mixtures from the Dortmund Data Bank.
- Score: 15.799043135621565
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The excess Gibbs energy plays a central role in chemical engineering and chemistry, providing a basis for modeling the thermodynamic properties of liquid mixtures. Predicting the excess Gibbs energy of multi-component mixtures solely from the molecular structures of their components is a long-standing challenge. In this work, we address this challenge by integrating physical laws as hard constraints within a flexible neural network. The resulting model, HANNA, was trained end-to-end on an extensive experimental dataset for binary mixtures from the Dortmund Data Bank, guaranteeing thermodynamically consistent predictions. A novel surrogate solver developed in this work enabled the inclusion of liquid-liquid equilibrium data in the training process. Furthermore, a geometric projection method was applied to enable robust extrapolations to multi-component mixtures, without requiring additional parameters. We demonstrate that HANNA delivers excellent predictions, clearly outperforming state-of-the-art benchmark methods in accuracy and scope. The trained model and corresponding code are openly available, and an interactive interface is provided on our website, MLPROP.
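The hard-constraint idea can be illustrated for a binary mixture: if a flexible model predicts the dimensionless excess Gibbs energy gE/RT with the pure-component limits built in, the activity coefficients follow by exact thermodynamic differentiation, and Gibbs-Duhem consistency holds by construction rather than by penalty. A minimal NumPy sketch of this principle (the function `f` below is an illustrative stand-in for the neural network; HANNA's actual architecture differs):

```python
import numpy as np

def gE_RT(x1, f=lambda x: 0.8 + 0.3 * x):
    """Dimensionless molar excess Gibbs energy gE/RT of a binary mixture.
    The prefactor x1*(1-x1) hard-codes gE = 0 for both pure components;
    f(x1) stands in for a flexible (e.g. neural-network) term."""
    return x1 * (1.0 - x1) * f(x1)

def ln_gammas(x1, h=1e-6):
    """Activity coefficients from the exact binary relations (g = gE/RT):
    ln gamma1 = g + (1-x1) dg/dx1,  ln gamma2 = g - x1 dg/dx1.
    Deriving both from one gE expression guarantees consistency."""
    g = gE_RT(x1)
    dg = (gE_RT(x1 + h) - gE_RT(x1 - h)) / (2.0 * h)  # central difference
    return g + (1.0 - x1) * dg, g - x1 * dg

# Gibbs-Duhem check at constant T, p: x1 d(ln g1)/dx1 + x2 d(ln g2)/dx1 = 0
x1, h = 0.4, 1e-5
l1p, l2p = ln_gammas(x1 + h)
l1m, l2m = ln_gammas(x1 - h)
residual = x1 * (l1p - l1m) / (2 * h) + (1 - x1) * (l2p - l2m) / (2 * h)
print(abs(residual))  # vanishes up to finite-difference error
```

A trained network would replace `f` and use automatic differentiation instead of finite differences, but the consistency argument is the same.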
Related papers
- Clapeyron Neural Networks for Single-Species Vapor-Liquid Equilibria [39.691553958657764]
Machine learning approaches have shown promising results for predicting molecular properties relevant for chemical process design. We propose a thermodynamics-informed ML approach that incorporates thermodynamic relations into the loss function as a regularization term for training. We find improved prediction accuracy of the Clapeyron-GNN compared to the single-task learning setting, and improved approximation of the Clapeyron equation compared to the purely data-driven multi-task learning setting.
arXiv Detail & Related papers (2026-02-20T16:11:42Z) - BITS for GAPS: Bayesian Information-Theoretic Sampling for hierarchical GAussian Process Surrogates [45.88028371034407]
We introduce the Bayesian Information-Theoretic Sampling for hierarchical GAussian Process Surrogates (BITS for GAPS) framework. BITS for GAPS supports serial hybrid modeling, where known physics governs part of the system. We derive entropy-based acquisition functions that quantify expected information gain from candidate input locations.
arXiv Detail & Related papers (2025-11-20T21:36:21Z) - Foundation Models for Discovery and Exploration in Chemical Space [57.97784111110166]
MIST is a family of molecular foundation models trained on large unlabeled datasets. We demonstrate the ability of these models to solve real-world problems across chemical space.
arXiv Detail & Related papers (2025-10-20T17:56:01Z) - KITINet: Kinetics Theory Inspired Network Architectures with PDE Simulation Approaches [43.872190335490515]
This paper introduces KITINet, a novel architecture that reinterprets feature propagation through the lens of non-equilibrium particle dynamics. At its core, we propose a residual module that models feature updates as the evolution of a particle system. This formulation mimics particle collisions and energy exchange, enabling adaptive feature refinement via physics-informed interactions.
arXiv Detail & Related papers (2025-05-23T13:58:29Z) - Efficient mapping of phase diagrams with conditional Boltzmann Generators [4.437335677401287]
We develop deep generative machine learning models based on the Boltzmann Generator approach for entire phase diagrams.
By training a single normalizing flow to transform the equilibrium distribution sampled at only one reference thermodynamic state to a wide range of target temperatures and pressures, we can efficiently generate equilibrium samples.
We demonstrate our approach by predicting the solid-liquid coexistence line for a Lennard-Jones system in excellent agreement with state-of-the-art free energy methods.
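The key statistical ingredient behind generating samples for many thermodynamic states from one reference state is importance reweighting: samples drawn at the reference conditions are carried to the target ensemble via Boltzmann weights. A minimal NumPy illustration on a toy harmonic potential (a trained normalizing flow would supply the proposal density; here the reference ensemble is sampled exactly):

```python
import numpy as np

rng = np.random.default_rng(0)

def U(x):
    """Toy potential energy: a harmonic well, U(x) = x^2 / 2."""
    return 0.5 * x**2

beta_ref, beta_tgt = 1.0, 2.0  # reference and target inverse temperatures

# Exact samples at the reference state (stand-in for flow-generated samples):
x = rng.normal(0.0, 1.0 / np.sqrt(beta_ref), size=200_000)

# Importance weights move the samples to the target ensemble:
# w ∝ exp(-(beta_tgt - beta_ref) * U(x)), stabilized via log-weights.
logw = -(beta_tgt - beta_ref) * U(x)
w = np.exp(logw - logw.max())
w /= w.sum()

x2_est = np.sum(w * x**2)         # reweighted estimate of <x^2> at beta_tgt
print(x2_est, 1.0 / beta_tgt)     # analytic reference value is 1/beta
```

The same reweighting step is what makes flow-based estimates of ensemble averages asymptotically unbiased even when the flow is imperfect.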
arXiv Detail & Related papers (2024-06-18T08:05:04Z) - Enhanced sampling of robust molecular datasets with uncertainty-based collective variables [0.0]
We propose a method that leverages uncertainty as the collective variable (CV) to guide the acquisition of chemically-relevant data points.
This approach employs a Gaussian Mixture Model-based uncertainty metric from a single model as the CV for biased molecular dynamics simulations.
arXiv Detail & Related papers (2024-02-06T06:42:51Z) - A Posteriori Evaluation of a Physics-Constrained Neural Ordinary Differential Equations Approach Coupled with CFD Solver for Modeling Stiff Chemical Kinetics [4.125745341349071]
We extend the NeuralODE framework for stiff chemical kinetics by incorporating mass conservation constraints directly into the loss function during training.
This ensures that the total mass and the elemental mass are conserved, a critical requirement for reliable downstream integration with CFD solvers.
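A conservation constraint of this kind can be written as a penalty on the elemental-mass residual between the initial and predicted species mass fractions. A hedged NumPy sketch for an illustrative H2/O2/H2O system (the element matrix and states below are made up for demonstration; the paper embeds such a term in the NeuralODE training loss):

```python
import numpy as np

# Elemental mass fractions per unit mass of species [H2, O2, H2O];
# each column sums to 1, so conserving the elements also conserves total mass.
E = np.array([
    [1.0, 0.0, 2.016 / 18.015],   # H
    [0.0, 1.0, 15.999 / 18.015],  # O
])

def conservation_penalty(Y0, Y_pred):
    """Squared elemental-mass residual between initial mass fractions Y0 and
    a predicted state Y_pred; added to the training loss, it drives the model
    toward trajectories that conserve H, O, and total mass."""
    return float(np.sum((E @ (Y_pred - Y0)) ** 2))

Y0 = np.array([0.10, 0.80, 0.10])           # initial mass fractions
nu = np.array([-4.032, -31.998, 36.030])    # 2 H2 + O2 -> 2 H2O, by mass
Y_ok = Y0 + 0.005 * nu                      # element-conserving update
Y_bad = Y0 + np.array([0.01, 0.0, 0.0])     # unphysical: creates hydrogen mass

print(conservation_penalty(Y0, Y_ok), conservation_penalty(Y0, Y_bad))
```

Because the stoichiometric mass vector `nu` sums to zero element by element, the first penalty vanishes while the violating update is penalized.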
arXiv Detail & Related papers (2023-11-22T22:40:49Z) - Differentiable Modeling and Optimization of Battery Electrolyte Mixtures Using Geometric Deep Learning [0.3141085922386211]
We develop a differentiable geometric deep learning model for chemical mixtures, DiffMix, which is applied in guiding robotic experimentation.
We show improved prediction accuracy and model robustness of DiffMix compared to its purely data-driven variants.
With a robotic experimentation setup, Clio, we improve ionic conductivity of electrolytes by over 18.8% within 10 experimental steps.
arXiv Detail & Related papers (2023-10-03T22:26:38Z) - Gibbs-Duhem-Informed Neural Networks for Binary Activity Coefficient Prediction [45.84205238554709]
We propose Gibbs-Duhem-informed neural networks for the prediction of binary activity coefficients at varying compositions.
We include the Gibbs-Duhem equation explicitly in the loss function for training neural networks.
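For a binary mixture the Gibbs-Duhem equation at constant T and p reads x1 d(ln gamma1)/dx1 + (1 - x1) d(ln gamma2)/dx1 = 0, so a natural loss term is the mean squared residual of this expression. A hedged NumPy sketch (the two lambdas stand in for network-predicted activity-coefficient curves; training would use automatic differentiation rather than finite differences):

```python
import numpy as np

def gibbs_duhem_penalty(ln_g1, ln_g2, x1, h=1e-4):
    """Soft Gibbs-Duhem penalty for two independently predicted curves
    ln_g1(x1) and ln_g2(x1): the mean squared residual of
    x1 * d(ln_g1)/dx1 + (1 - x1) * d(ln_g2)/dx1, via central differences.
    Added to the data loss, it regularizes the network toward consistency."""
    d1 = (ln_g1(x1 + h) - ln_g1(x1 - h)) / (2 * h)
    d2 = (ln_g2(x1 + h) - ln_g2(x1 - h)) / (2 * h)
    return float(np.mean((x1 * d1 + (1.0 - x1) * d2) ** 2))

x1 = np.linspace(0.05, 0.95, 19)
A = 1.2  # illustrative Margules parameter

# One-parameter Margules model (thermodynamically consistent by construction):
consistent = gibbs_duhem_penalty(lambda x: A * (1 - x) ** 2,
                                 lambda x: A * x ** 2, x1)
# Deliberately inconsistent pair of curves:
inconsistent = gibbs_duhem_penalty(lambda x: A * (1 - x) ** 2,
                                   lambda x: 0.5 * A * x ** 2, x1)
print(consistent, inconsistent)
```

The consistent Margules pair yields a near-zero penalty, while the mismatched pair is penalized, which is exactly the signal the regularizer contributes during training.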
arXiv Detail & Related papers (2023-05-31T07:36:45Z) - Hybrid quantum physics-informed neural networks for simulating computational fluid dynamics in complex shapes [37.69303106863453]
We present a hybrid quantum physics-informed neural network that simulates laminar fluid flows in 3D Y-shaped mixers.
Our approach combines the expressive power of a quantum model with the flexibility of a physics-informed neural network, resulting in a 21% higher accuracy compared to a purely classical neural network.
arXiv Detail & Related papers (2023-04-21T20:49:29Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Physics-informed machine learning with differentiable programming for heterogeneous underground reservoir pressure management [64.17887333976593]
Avoiding over-pressurization in subsurface reservoirs is critical for applications like CO2 sequestration and wastewater injection.
Managing these pressures by controlling injection and extraction rates is challenging because of the complex heterogeneity of the subsurface.
We use differentiable programming with a full-physics model and machine learning to determine the fluid extraction rates that prevent over-pressurization.
arXiv Detail & Related papers (2022-06-21T20:38:13Z) - Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that the ML models accurately predict fuel properties across a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.