Property prediction for ionic liquids without prior structural knowledge using limited experimental data: A data-driven neural recommender system leveraging transfer learning
- URL: http://arxiv.org/abs/2509.10273v1
- Date: Fri, 12 Sep 2025 14:13:31 GMT
- Title: Property prediction for ionic liquids without prior structural knowledge using limited experimental data: A data-driven neural recommender system leveraging transfer learning
- Authors: Sahil Sethi, Kai Sundmacher, Caroline Ganzer
- Abstract summary: Ionic liquids (ILs) have emerged as versatile replacements for traditional solvents. Accurately predicting key thermophysical properties remains challenging due to the vast chemical design space. We present a data-driven transfer learning framework that enables reliable property prediction for ILs using sparse experimental datasets.
- Score: 0.34410212782758043
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Ionic liquids (ILs) have emerged as versatile replacements for traditional solvents because their physicochemical properties can be precisely tailored to various applications. However, accurately predicting key thermophysical properties remains challenging due to the vast chemical design space and the limited availability of experimental data. In this study, we present a data-driven transfer learning framework that leverages a neural recommender system (NRS) to enable reliable property prediction for ILs using sparse experimental datasets. The approach involves a two-stage process: first, pre-training NRS models on COSMO-RS-based simulated data at fixed temperature and pressure to learn property-specific structural embeddings for cations and anions; and second, fine-tuning simple feedforward neural networks using these embeddings with experimental data at varying temperatures and pressures. In this work, five essential IL properties are considered: density, viscosity, surface tension, heat capacity, and melting point. The framework supports both within-property and cross-property knowledge transfer. Notably, pre-trained models for density, viscosity, and heat capacity are used to fine-tune models for all five target properties, achieving improved performance by a substantial margin for four of them. The model exhibits robust extrapolation to previously unseen ILs. Moreover, the final trained models enable property prediction for over 700,000 IL combinations, offering a scalable solution for IL screening in process design. This work highlights the effectiveness of combining simulated data and transfer learning to overcome sparsity in the experimental data.
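The two-stage recipe described in the abstract (pre-train ion embeddings on dense simulated property data, then fine-tune a small model on sparse experimental measurements at varying conditions) can be illustrated with a toy matrix-factorisation recommender. Everything below is a hypothetical stand-in, not the authors' NRS architecture: the ion counts, embedding size, simulated table, and linear fine-tuning head are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cat, n_an, dim = 8, 6, 4  # hypothetical ion counts and embedding size

# --- Stage 1: pre-train a matrix-factorisation recommender on a dense,
# simulated cation x anion property table (stand-in for COSMO-RS data).
sim = rng.normal(size=(n_cat, dim)) @ rng.normal(size=(n_an, dim)).T
A = rng.normal(size=(n_an, dim))               # anion embeddings (random init)
for _ in range(10):                            # alternating least squares
    C = sim @ A @ np.linalg.pinv(A.T @ A)      # cation embeddings, A fixed
    A = sim.T @ C @ np.linalg.pinv(C.T @ C)    # anion embeddings, C fixed
mse_pretrain = float(np.mean((C @ A.T - sim) ** 2))

# --- Stage 2: fine-tune a small head on sparse "experimental" observations,
# reusing the frozen ion embeddings plus temperature as input features.
obs = [(0, 1, 298.0), (2, 3, 313.0), (4, 0, 333.0), (5, 5, 353.0), (7, 2, 298.0)]
X = np.array([np.r_[C[i], A[j], T / 1000.0, 1.0] for i, j, T in obs])
y = np.array([sim[i, j] + 1e-3 * (T - 298.0) for i, j, T in obs])
w, *_ = np.linalg.lstsq(X, y, rcond=None)      # fine-tuned head weights
fit_mse = float(np.mean((X @ w - y) ** 2))
```

The point of the sketch is the division of labour: the embeddings absorb structural information from abundant simulated data, so the experimental stage only has to fit a small number of parameters.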
Related papers
- Refining Machine Learning Potentials through Thermodynamic Theory of Phase Transitions [0.0]
This work proposes a fine-tuning strategy via top-down learning to correct wrongly predicted transition temperatures. We demonstrate that our approach can accurately correct the phase diagram of pure titanium in a pressure range of up to 5 GPa. Our approach is model-agnostic and applicable to multi-component systems with solid-solid and solid-liquid transitions.
arXiv Detail & Related papers (2025-12-03T17:06:26Z) - Parameter-Efficient Conditioning for Material Generalization in Graph-Based Simulators [2.504298819189614]
Graph network-based simulators (GNS) have demonstrated strong potential for learning particle-based physics. Existing models are typically trained for a single material type and fail to generalize across distinct behaviors. We propose a parameter-efficient conditioning mechanism that makes the GNS model adaptive to material parameters.
arXiv Detail & Related papers (2025-11-07T17:55:35Z) - Foundation Models for Discovery and Exploration in Chemical Space [57.97784111110166]
MIST is a family of molecular foundation models trained on large unlabeled datasets. We demonstrate the ability of these models to solve real-world problems across chemical space.
arXiv Detail & Related papers (2025-10-20T17:56:01Z) - Fusing CFD and measurement data using transfer learning [49.1574468325115]
We introduce a non-linear method based on neural networks combining simulation and measurement data via transfer learning. In a first step, the neural network is trained on simulation data to learn spatial features of the distributed quantities. The second step involves transfer learning on the measurement data to correct for systematic errors between simulation and measurement by only re-training a small subset of the entire neural network model.
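The "re-train only a small subset" idea in this summary can be sketched with a toy model: fixed nonlinear features stand in for layers pre-trained on simulation data, and only the output layer is re-fitted on a few biased "measurements". The data, feature widths, and bias below are hypothetical, not taken from the paper.

```python
import numpy as np

def features(x):
    """Frozen stage: fixed nonlinear features, standing in for network
    layers pre-trained on simulation data (widths are hypothetical)."""
    W = np.linspace(-3.0, 3.0, 16)
    return np.tanh(np.outer(x, W) + 0.1)

# "Simulation" data: train the output layer on a smooth field.
x_sim = np.linspace(0.0, 1.0, 200)
y_sim = np.sin(2 * np.pi * x_sim)
H = np.c_[features(x_sim), np.ones_like(x_sim)]
w_sim, *_ = np.linalg.lstsq(H, y_sim, rcond=None)

# "Measurements" carry a systematic error; correct it by re-fitting ONLY
# the output layer (a small subset of the model) on a handful of points.
x_meas = np.linspace(0.0, 1.0, 12)
y_meas = np.sin(2 * np.pi * x_meas) + 0.3        # constant measurement bias
Hm = np.c_[features(x_meas), np.ones_like(x_meas)]
delta, *_ = np.linalg.lstsq(Hm, y_meas - Hm @ w_sim, rcond=None)
w_tl = w_sim + delta                             # transfer-learned weights
err = float(np.max(np.abs(Hm @ w_tl - y_meas)))
```

Because the correction is a minimum-norm update to the simulation-trained weights, the model stays close to the simulation solution while matching the measured points.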
arXiv Detail & Related papers (2025-07-28T07:21:46Z) - Machine Learning for Improved Density Functional Theory Thermodynamics [0.0]
We present a machine learning (ML) approach to systematically correct intrinsic energy resolution errors in density functional theory calculations. A neural network model has been trained to predict the discrepancy between DFT-calculated and experimentally measured enthalpies for binary and ternary alloys and compounds. We illustrate the effectiveness of this method by applying it to the Al-Ni-Pd and Al-Ni-Ti systems, which are of interest for high-temperature applications in aerospace and protective coatings.
arXiv Detail & Related papers (2025-03-07T15:46:30Z) - Transfer Learning for Deep Learning-based Prediction of Lattice Thermal Conductivity [0.0]
We study the impact of transfer learning on the precision and generalizability of a deep learning model (ParAIsite). We show that a much greater improvement is obtained when first fine-tuning it on a large dataset of low-quality approximations of lattice thermal conductivity (LTC). The promising results pave the way towards a greater ability to explore large databases in search of low thermal conductivity materials.
arXiv Detail & Related papers (2024-11-27T11:57:58Z) - Predicting ionic conductivity in solids from the machine-learned potential energy landscape [68.25662704255433]
We propose an approach for the quick and reliable screening of ionic conductors through the analysis of a universal interatomic potential. Eight out of the ten highest-ranked materials are confirmed to be superionic at room temperature in first-principles calculations. Our method achieves a speed-up factor of approximately 50 compared to molecular dynamics driven by a machine-learning potential, and is at least 3,000 times faster compared to first-principles molecular dynamics.
arXiv Detail & Related papers (2024-11-11T09:01:36Z) - Generalizable Prediction Model of Molten Salt Mixture Density with Chemistry-Informed Transfer Learning [2.251726366940184]
Optimally designing molten salt applications requires knowledge of their thermophysical properties.
A transfer learning approach using deep neural networks (DNNs) is proposed, combining Redlich-Kister models, experimental data, and ab initio properties.
The approach predicts molten salt density with high accuracy ($R^2$ > 0.99, MAPE 1%), outperforming the alternatives.
arXiv Detail & Related papers (2024-10-19T14:28:46Z) - Machine learning enabled experimental design and parameter estimation for ultrafast spin dynamics [54.172707311728885]
We introduce a methodology that combines machine learning with Bayesian optimal experimental design (BOED).
Our method employs a neural network model for large-scale spin dynamics simulations for precise distribution and utility calculations in BOED.
Our numerical benchmarks demonstrate the superior performance of our method in guiding XPFS experiments, predicting model parameters, and yielding more informative measurements within limited experimental time.
arXiv Detail & Related papers (2023-06-03T06:19:20Z) - Hybrid full-field thermal characterization of additive manufacturing processes using physics-informed neural networks with data [5.653328302363391]
We develop a hybrid physics-based data-driven thermal modeling approach of AM processes using physics-informed neural networks.
Partially observed temperature data measured from an infrared camera are combined with the physics laws to predict full-field temperature history.
Results show that the hybrid thermal model can effectively identify unknown parameters and capture the full-field temperature accurately.
arXiv Detail & Related papers (2022-06-15T18:27:10Z) - Pre-training via Denoising for Molecular Property Prediction [53.409242538744444]
We describe a pre-training technique that utilizes large datasets of 3D molecular structures at equilibrium.
Inspired by recent advances in noise regularization, our pre-training objective is based on denoising.
arXiv Detail & Related papers (2022-05-31T22:28:34Z) - Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict the fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z) - Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation [51.66271681532262]
Online Self-Acquired Knowledge Distillation (OSAKD) is proposed, aiming to improve the performance of any deep neural model in an online manner.
We utilize the k-NN non-parametric density estimation technique for estimating the unknown probability distributions of the data samples in the output feature space.
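The k-NN density estimator mentioned in this summary is a standard non-parametric technique and easy to sketch. The snippet below is a generic illustration (uniform 2-D data, hypothetical sample count and k), not the paper's implementation.

```python
import numpy as np
from math import gamma, pi

def knn_density(query, samples, k):
    """k-NN density estimate: p(x) ~= k / (n * V_d(r_k)), where r_k is the
    distance from x to its k-th nearest sample and V_d(r_k) is the volume
    of a d-dimensional ball of radius r_k."""
    n, d = samples.shape
    r_k = np.sort(np.linalg.norm(samples - query, axis=1))[k - 1]
    ball_vol = pi ** (d / 2) / gamma(d / 2 + 1) * r_k ** d
    return k / (n * ball_vol)

# Sanity check against a known density: uniform samples on the unit square
# have density 1 everywhere away from the edges.
rng = np.random.default_rng(0)
pts = rng.random((5000, 2))
est = knn_density(np.array([0.5, 0.5]), pts, k=50)
```

Larger k reduces the variance of the estimate but blurs local structure; the estimator is biased near the support boundary, where the k-NN ball extends outside the data region.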
arXiv Detail & Related papers (2021-08-26T14:01:04Z) - Learning the exchange-correlation functional from nature with fully differentiable density functional theory [0.0]
We train a neural network to replace the exchange-correlation functional within a fully-differentiable three-dimensional Kohn-Sham density functional theory framework.
Our trained exchange-correlation network provided improved prediction of atomization and ionization energies across a collection of 110 molecules.
arXiv Detail & Related papers (2021-02-08T14:25:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.