An Explainable AI Model for Binary LJ Fluids
- URL: http://arxiv.org/abs/2502.17357v1
- Date: Mon, 24 Feb 2025 17:35:01 GMT
- Title: An Explainable AI Model for Binary LJ Fluids
- Authors: Israrul H Hashmi, Rahul Karmakar, Marripelli Maniteja, Kumar Ayush, Tarak K. Patra
- Abstract summary: We report the construction and utility of an artificial intelligence (AI) model for binary LJ fluids. The model is shown to predict radial distribution functions (RDFs) for many unknown mixtures very accurately. We highlight the areas where the fidelity of the AI model is low when encountering new regimes with different underlying physics.
- Score: 2.4779633742344918
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Lennard-Jones (LJ) fluids serve as an important theoretical framework for understanding molecular interactions. Binary LJ fluids, where two distinct species of particles interact via the LJ potential, exhibit rich phase behavior and provide valuable insights into complex fluid mixtures. Here we report the construction and utility of an artificial intelligence (AI) model for binary LJ fluids, focusing on its effectiveness in predicting radial distribution functions (RDFs) across a range of conditions. The RDFs of a binary mixture with varying compositions and temperatures are collected from molecular dynamics (MD) simulations to establish and validate the AI model. In this AI pipeline, RDFs are discretized in order to reduce the output dimension of the model. This, in turn, improves the efficacy and reduces the complexity of the AI RDF model. The model is shown to predict RDFs for many unknown mixtures very accurately, especially outside the training temperature range. Our analysis suggests that the particle size ratio has a higher-order impact on the microstructure of a binary mixture. We also highlight the areas where the fidelity of the AI model is low when encountering new regimes with different underlying physics.
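A minimal, non-authoritative sketch of the pipeline the abstract outlines, assuming a standard 12-6 LJ potential with Lorentz-Berthelot combining rules, a coarse radial grid for discretizing g(r), and a small scikit-learn regressor that maps (composition, temperature, size ratio) to the discretized RDF. The bin count, network architecture, input features, and placeholder data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Lorentz-Berthelot combining rules for the cross interaction of a binary LJ mixture
# (a standard convention; the paper's parameterization may differ).
def lj_cross_params(sigma_a, sigma_b, eps_a, eps_b):
    sigma_ab = 0.5 * (sigma_a + sigma_b)   # arithmetic mean of particle sizes
    eps_ab = np.sqrt(eps_a * eps_b)        # geometric mean of well depths
    return sigma_ab, eps_ab

def lj_potential(r, sigma, eps):
    """12-6 Lennard-Jones pair potential U(r) = 4*eps*[(sigma/r)**12 - (sigma/r)**6]."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

# Discretize an RDF onto a coarse grid, reducing the model's output dimension
# as described in the abstract (the bin count is an illustrative choice).
def discretize_rdf(r, g_r, n_bins=50, r_max=4.0):
    edges = np.linspace(0.0, r_max, n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, np.interp(centers, r, g_r)

# Placeholder stand-in for an MD-measured g(r); real targets would come from simulation.
r_dense = np.linspace(0.8, 4.0, 500)
g_dense = 1.0 + np.exp(-(r_dense - 1.1) ** 2 / 0.02)
_, g_coarse = discretize_rdf(r_dense, g_dense)

# Hypothetical training set: inputs are (mole fraction of A, temperature, size ratio);
# targets are discretized RDFs stacked row-wise.
X_train = np.array([[0.2, 1.0, 0.8], [0.5, 1.2, 0.8], [0.8, 1.5, 1.2]])
y_train = np.vstack([g_coarse, g_coarse * 0.95, g_coarse * 1.05])  # dummy targets

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

# Predict the discretized RDF at an unseen state point.
g_pred = model.predict(np.array([[0.35, 1.1, 1.0]]))
```

In practice the training targets would be the partial RDFs (A-A, A-B, B-B) collected from MD runs at each composition and temperature, but the shape of the workflow is the same.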
Related papers
- One Diffusion Step to Real-World Super-Resolution via Flow Trajectory Distillation [60.54811860967658]
FluxSR is a novel one-step diffusion Real-ISR based on flow matching models.
First, we introduce Flow Trajectory Distillation (FTD) to distill a multi-step flow matching model into a one-step Real-ISR.
Second, to improve image realism and address high-frequency artifact issues in generated images, we propose TV-LPIPS as a perceptual loss.
arXiv Detail & Related papers (2025-02-04T04:11:29Z)
- Physics Informed Distillation for Diffusion Models [21.173298037358954]
We introduce Physics Informed Distillation (PID), which employs a student model to represent the solution of the ODE system corresponding to the teacher diffusion model.
We observe that PID performance achieves comparable to recent distillation methods.
arXiv Detail & Related papers (2024-11-13T07:03:47Z)
- Maximum Entropy Inverse Reinforcement Learning of Diffusion Models with Energy-Based Models [12.327318533784961]
We present a maximum entropy inverse reinforcement learning (IRL) approach for improving the sample quality of diffusion generative models.
We train (or fine-tune) a diffusion model using the log density estimated from training data.
Our empirical studies show that diffusion models fine-tuned using DxMI can generate high-quality samples in as few as 4 and 10 steps.
arXiv Detail & Related papers (2024-06-30T08:52:17Z)
- Generalization capabilities and robustness of hybrid models grounded in physics compared to purely deep learning models [2.8686437689115363]
This study investigates the generalization capabilities and robustness of purely deep learning (DL) models and hybrid models based on physical principles in fluid dynamics applications. Three autoregressive models were compared: a hybrid model (POD-DL) that combines proper orthogonal decomposition (POD) with a long short-term memory (LSTM) layer, a convolutional autoencoder combined with a convolutional LSTM layer, and a variational autoencoder (VAE) combined with a ConvLSTM layer. While the VAE and ConvLSTM models accurately predicted laminar flow, the hybrid POD-DL model outperformed the others; a generic sketch of the POD step follows this entry.
arXiv Detail & Related papers (2024-04-27T12:43:02Z)
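Purely as a generic illustration of the POD step mentioned in the entry above (not that paper's implementation), the sketch below builds POD modes by a thin SVD of a synthetic snapshot matrix and extracts the temporal coefficients that an LSTM would then forecast; the array shapes and retained rank are assumptions.

```python
import numpy as np

# Generic proper orthogonal decomposition (POD) of a snapshot matrix via thin SVD.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((400, 200))   # hypothetical: 400 spatial points x 200 time steps

mean_flow = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_flow                 # POD is applied to fluctuations about the mean

# Columns of U are spatial POD modes; diag(S) @ Vt gives the temporal coefficients.
U, S, Vt = np.linalg.svd(fluct, full_matrices=False)

r = 10                                        # number of retained modes (illustrative)
modes = U[:, :r]
coeffs = np.diag(S[:r]) @ Vt[:r, :]           # shape (r, n_time): the sequence a forecaster would learn

# Rank-r reconstruction and the fraction of energy captured by the retained modes.
recon = mean_flow + modes @ coeffs
energy_captured = (S[:r] ** 2).sum() / (S ** 2).sum()
```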
- Predicting the Radiation Field of Molecular Clouds using Denoising Diffusion Probabilistic Models [2.2215308271891403]
We employ deep learning techniques to predict the interstellar radiation field (ISRF) strength based on three-band dust emission at 4.5 um, 24 um, and 250 um.
Our model robustly predicts radiation feedback distribution, even in complex, poorly constrained ISRF environments.
arXiv Detail & Related papers (2023-09-11T20:28:43Z)
- Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models [77.83923746319498]
We propose a framework called Diff-Instruct to instruct the training of arbitrary generative models.
We show that Diff-Instruct results in state-of-the-art single-step diffusion-based models.
Experiments on refining GAN models show that Diff-Instruct can consistently improve the pre-trained generators of GAN models.
arXiv Detail & Related papers (2023-05-29T04:22:57Z)
- Hierarchical Integration Diffusion Model for Realistic Image Deblurring [71.76410266003917]
Diffusion models (DMs) have been introduced in image deblurring and exhibited promising performance.
We propose the Hierarchical Integration Diffusion Model (HI-Diff) for realistic image deblurring.
Experiments on synthetic and real-world blur datasets demonstrate that our HI-Diff outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-05-22T12:18:20Z)
- Forecasting through deep learning and modal decomposition in two-phase concentric jets [2.362412515574206]
This work aims to improve fuel chamber injectors' performance in turbofan engines.
It requires the development of models that allow real-time prediction and improvement of the fuel/air mixture.
arXiv Detail & Related papers (2022-12-24T12:59:41Z)
- Flexible Amortized Variational Inference in qBOLD MRI [56.4324135502282]
Oxygen extraction fraction (OEF) and deoxygenated blood volume (DBV) are more ambiguously determined from the data.
Existing inference methods tend to yield very noisy and underestimated OEF maps, while overestimating DBV.
This work describes a novel probabilistic machine learning approach that can infer plausible distributions of OEF and DBV.
arXiv Detail & Related papers (2022-03-11T10:47:16Z)
- BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z)
- Training Deep Energy-Based Models with f-Divergence Minimization [113.97274898282343]
Deep energy-based models (EBMs) are very flexible in distribution parametrization but computationally challenging.
We propose a general variational framework termed f-EBM to train EBMs using any desired f-divergence.
Experimental results demonstrate the superiority of f-EBM over contrastive divergence, as well as the benefits of training EBMs using f-divergences other than KL.
arXiv Detail & Related papers (2020-03-06T23:11:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.