Attention-enhanced neural differential equations for physics-informed
deep learning of ion transport
- URL: http://arxiv.org/abs/2312.02871v1
- Date: Tue, 5 Dec 2023 16:39:24 GMT
- Authors: Danyal Rehman and John H. Lienhard
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Species transport models typically combine partial differential equations
(PDEs) with relations from hindered transport theory to quantify
electromigrative, convective, and diffusive transport through complex
nanoporous systems; however, these formulations are frequently substantial
simplifications of the governing dynamics, leading to the poor generalization
performance of PDE-based models. Given the growing interest in deep learning
methods for the physical sciences, we develop a machine learning-based approach
to characterize ion transport across nanoporous membranes. Our proposed
framework centers around attention-enhanced neural differential equations that
incorporate electroneutrality-based inductive biases to improve generalization
performance relative to conventional PDE-based methods. In addition, we study
the role of the attention mechanism in illuminating physically-meaningful
ion-pairing relationships across diverse mixture compositions. Further, we
investigate the importance of pre-training on simulated data from PDE-based
models, as well as the performance benefits from hard vs. soft inductive
biases. Our results indicate that physics-informed deep learning solutions can
outperform their classical PDE-based counterparts and provide promising avenues
for modelling complex transport phenomena across diverse applications.
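As a rough illustration of the architecture described above, the sketch below (assumed for illustration, not the authors' code: the ion species, weight shapes, and integrator are all hypothetical) embeds each ion concentration as a token, applies single-head self-attention so the model can capture ion-pairing interactions between species, and expresses the electroneutrality condition sum_i z_i c_i = 0 as a soft penalty that can serve as an inductive bias in the loss:

```python
import numpy as np

# Hypothetical valences for a three-ion mixture (e.g. Na+, Cl-, SO4^2-);
# illustrative only, not taken from the paper.
Z = np.array([+1.0, -1.0, -2.0])

rng = np.random.default_rng(0)
D = 3            # number of ion species (one token each)
H = 4            # attention head width
Wq, Wk, Wv = (rng.standard_normal((1, H)) * 0.1 for _ in range(3))
Wout = rng.standard_normal((H, 1)) * 0.1

def attention_rhs(c):
    """Attention-enhanced right-hand side dc/dt = f(c)."""
    x = c.reshape(D, 1)                    # one token per ion
    q, k, v = x @ Wq, x @ Wk, x @ Wv       # (D, H) projections
    scores = q @ k.T / np.sqrt(H)          # pairwise ion affinities
    a = np.exp(scores - scores.max(axis=1, keepdims=True))
    a /= a.sum(axis=1, keepdims=True)      # softmax over species
    return ((a @ v) @ Wout).ravel()        # (D,) learned flux

def electroneutrality_penalty(c):
    """Soft inductive bias: penalize net charge sum_i z_i * c_i != 0."""
    return (Z @ c) ** 2

def euler_step(c, dt=1e-2):
    """One explicit-Euler step of the neural ODE."""
    return c + dt * attention_rhs(c)

# An electroneutral mixture (+3 - 1 - 2 = 0) incurs zero penalty.
c0 = np.array([3.0, 1.0, 1.0])
c1 = euler_step(c0)
```

In practice the projection weights would be trained end-to-end by backpropagating through the ODE solver, and a hard inductive bias would instead constrain the state to the electroneutral manifold at every step rather than penalizing deviations.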
Related papers
- Predicting ionic conductivity in solids from the machine-learned potential energy landscape
Superionic materials are essential for advancing solid-state batteries, which offer improved energy density and safety.
Conventional computational methods for identifying such materials are resource-intensive and not easily scalable.
We propose an approach for the quick and reliable evaluation of ionic conductivity through the analysis of a universal interatomic potential.
arXiv Detail & Related papers (2024-11-11T09:01:36Z)
- Neural Message Passing Induced by Energy-Constrained Diffusion
We propose an energy-constrained diffusion model as a principled interpretable framework for understanding the mechanism of MPNNs.
We show that the new model can yield promising performance for cases where the data structures are observed (as a graph), partially observed or completely unobserved.
arXiv Detail & Related papers (2024-09-13T17:54:41Z)
- Dissecting van der Waals interactions with Density Functional Theory -- Wannier-basis approach
This is an electronic-based many-body method that captures the full electronic and optical response properties of the materials.
It provides the foundation to discern van der Waals and induction energies as well as the role of anisotropy and different stacking patterns.
Our investigation aims at stimulating new experimental studies to measure van der Waals energies in a wider range of materials.
arXiv Detail & Related papers (2024-06-17T15:42:01Z)
- Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z)
- Neural Operators Learn the Local Physics of Magnetohydrodynamics
Magnetohydrodynamics (MHD) plays a pivotal role in describing the dynamics of plasma and conductive fluids.
Recent advances introduce neural operators like the Fourier Neural Operator (FNO) as surrogate models for traditional numerical analyses.
This study explores a modified Flux Fourier neural operator model to approximate the numerical flux of ideal MHD.
arXiv Detail & Related papers (2024-04-24T17:48:38Z)
- Neural oscillators for generalization of physics-informed machine learning
A primary challenge of physics-informed machine learning (PIML) is its generalization beyond the training domain.
This paper aims to enhance the generalization capabilities of PIML, facilitating practical, real-world applications.
We leverage the inherent causality and temporal sequential characteristics of PDE solutions to fuse PIML models with recurrent neural architectures.
arXiv Detail & Related papers (2023-08-17T13:50:03Z)
- PDE+: Enhancing Generalization via PDE with Adaptive Distributional Diffusion
The generalization of neural networks is a central challenge in machine learning.
We propose to enhance it directly through the underlying function of neural networks, rather than focusing on adjusting input data.
We put this theoretical framework into practice as PDE+ (PDE with Adaptive Distributional Diffusion).
arXiv Detail & Related papers (2023-05-25T08:23:26Z)
- Physics-constrained neural differential equations for learning multi-ionic transport
We develop the first physics-informed deep learning model to learn ion transport behaviour across polyamide nanopores.
We use neural differential equations in conjunction with classical closure models, embedding the latter as inductive biases directly in the neural framework.
arXiv Detail & Related papers (2023-03-07T17:18:52Z)
- Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics
Physics-informed deep learning (PiDL) addresses these challenges by incorporating physical principles into the model.
We propose to leverage physics prior knowledge by "baking" the discretized governing equations into the neural network architecture.
This method, which embeds discretized PDEs through convolutional residual networks in a multi-resolution setting, greatly improves generalizability and long-term prediction accuracy.
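As a minimal sketch of this embedding idea (illustrative only; the operator, grid, and coefficients below are assumed, not taken from the paper), a discretized 1-D diffusion equation u_t = nu * u_xx can be baked into a residual update as a fixed convolution stencil, with an optional learned correction added alongside it:

```python
import numpy as np

# Assumed toy discretization: 1-D diffusion with diffusivity nu on a
# uniform grid of spacing dx, advanced with an explicit-Euler step dt.
nu, dx, dt = 0.1, 0.1, 1e-3
stencil = nu / dx**2 * np.array([1.0, -2.0, 1.0])   # second-difference

def pde_residual_step(u, learned_correction=None):
    """One residual update: u <- u + dt * (discretized PDE + learned term)."""
    lap = np.convolve(u, stencil, mode="same")       # fixed, physics-derived
    du = lap if learned_correction is None else lap + learned_correction(u)
    return u + dt * du

# Diffusion smooths an initial sine bump: the peak decays over one step.
u0 = np.sin(np.linspace(0, np.pi, 64))
u1 = pde_residual_step(u0)
```

The key design choice is that the stencil weights are fixed by the discretized physics rather than learned, so only the residual correction network must be fit from data.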
arXiv Detail & Related papers (2022-05-09T01:27:58Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic $\Phi^4_1$ model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Training Deep Energy-Based Models with f-Divergence Minimization
Deep energy-based models (EBMs) are very flexible in distribution parametrization but computationally challenging.
We propose a general variational framework termed f-EBM to train EBMs using any desired f-divergence.
Experimental results demonstrate the superiority of f-EBM over contrastive divergence, as well as the benefits of training EBMs using f-divergences other than KL.
arXiv Detail & Related papers (2020-03-06T23:11:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.