Leveraging Active Subspaces to Capture Epistemic Model Uncertainty in Deep Generative Models for Molecular Design
- URL: http://arxiv.org/abs/2405.00202v2
- Date: Thu, 15 Aug 2024 21:08:13 GMT
- Title: Leveraging Active Subspaces to Capture Epistemic Model Uncertainty in Deep Generative Models for Molecular Design
- Authors: A N M Nafiz Abeer, Sanket Jantre, Nathan M Urban, Byung-Jun Yoon
- Abstract summary: Generative molecular design models have seen fewer efforts on uncertainty quantification (UQ) due to the computational challenges in Bayesian inference posed by their large number of parameters.
In this work, we focus on the junction-tree variational autoencoder (JT-VAE), a popular model for generative molecular design, and address this issue by leveraging the low dimensional active subspace to capture the uncertainty in the model parameters.
- Score: 2.0701439270461184
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep generative models have been accelerating the inverse design process in material and drug design. Unlike their counterpart property predictors in typical molecular design frameworks, generative molecular design models have seen fewer efforts on uncertainty quantification (UQ) due to computational challenges in Bayesian inference posed by their large number of parameters. In this work, we focus on the junction-tree variational autoencoder (JT-VAE), a popular model for generative molecular design, and address this issue by leveraging the low dimensional active subspace to capture the uncertainty in the model parameters. Specifically, we approximate the posterior distribution over the active subspace parameters to estimate the epistemic model uncertainty in an extremely high dimensional parameter space. The proposed UQ scheme does not require alteration of the model architecture, making it readily applicable to any pre-trained model. Our experiments demonstrate the efficacy of the AS-based UQ and its potential impact on molecular optimization by exploring the model diversity under epistemic uncertainty.
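The active-subspace scheme described in the abstract can be sketched directly: estimate the expected gradient outer product of a loss over the model parameters, take its top eigenvectors as the low-dimensional subspace, and perturb the pretrained parameters only along those directions. The quadratic loss, dimensions, and standard-normal draw for z below are toy stand-ins, not the JT-VAE posterior from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D, k, n_samples = 50, 2, 200  # full parameter dim, subspace dim (toy sizes)

# Hypothetical loss whose gradients vary mostly along two directions
A = rng.normal(size=(D, D)) * 0.01
A[:2, :2] += np.diag([3.0, 2.0])         # dominant curvature directions
def grad_loss(theta):
    return (A + A.T) @ theta             # gradient of theta^T A theta

# 1) Monte Carlo estimate of C = E[grad grad^T]
thetas = rng.normal(size=(n_samples, D))
C = np.mean([np.outer(g, g) for g in map(grad_loss, thetas)], axis=0)

# 2) Active subspace: span of the top-k eigenvectors of C
eigvals, eigvecs = np.linalg.eigh(C)     # ascending eigenvalues
W = eigvecs[:, ::-1][:, :k]              # (D, k) basis of the subspace

# 3) Epistemic perturbation lives only in the subspace:
#    theta = theta_star + W @ z, with z from an approximate posterior
theta_star = np.zeros(D)                 # stand-in for pretrained weights
z = rng.normal(size=k)                   # placeholder posterior sample
theta_sample = theta_star + W @ z
```

Because the pre-trained weights enter only through theta_star, the scheme matches the abstract's claim that no architecture change is needed: sampling z and adding W @ z yields parameter samples for any existing model.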
Related papers
- Steering Masked Discrete Diffusion Models via Discrete Denoising Posterior Prediction [88.65168366064061]
We introduce Discrete Denoising Posterior Prediction (DDPP), a novel framework that casts the task of steering pre-trained MDMs as a problem of probabilistic inference.
Our framework leads to a family of three novel objectives that are all simulation-free, and thus scalable.
We substantiate our designs via wet-lab validation, where we observe transient expression of reward-optimized protein sequences.
arXiv Detail & Related papers (2024-10-10T17:18:30Z) - SMILE: Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models [85.67096251281191]
We present an innovative approach to model fusion called zero-shot Sparse MIxture of Low-rank Experts (SMILE) construction.
SMILE allows for the upscaling of source models into an MoE model without extra data or further training.
We conduct extensive experiments across diverse scenarios, such as image classification and text generation tasks, using full fine-tuning and LoRA fine-tuning.
arXiv Detail & Related papers (2024-08-19T17:32:15Z) - Enhancing Generative Molecular Design via Uncertainty-guided Fine-tuning of Variational Autoencoders [2.0701439270461184]
A critical challenge for pre-trained generative molecular design models is to fine-tune them to be better suited for downstream design tasks.
In this work, we propose a novel approach for uncertainty-guided fine-tuning of a variational autoencoder (VAE)-based GMD model through performance feedback in an active learning setting.
arXiv Detail & Related papers (2024-05-31T02:00:25Z) - Molecule Design by Latent Prompt Transformer [76.2112075557233]
This work explores the challenging problem of molecule design by framing it as a conditional generative modeling task.
We propose a novel generative model comprising three components: (1) a latent vector with a learnable prior distribution; (2) a molecule generation model based on a causal Transformer, which uses the latent vector as a prompt; and (3) a property prediction model that predicts a molecule's target properties and/or constraint values using the latent prompt.
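The three components above can be mirrored in a deliberately minimal structural sketch. The linear prompt-to-logits map below is a toy stand-in for the causal Transformer, and every dimension, weight matrix, and the fixed prior are hypothetical; the sketch only shows how a single latent vector simultaneously drives generation and property prediction.

```python
import numpy as np

rng = np.random.default_rng(0)
d_latent, vocab, seq_len = 8, 20, 5  # toy sizes, all hypothetical

# (1) Latent vector drawn from a (here fixed; in the paper learnable) prior
z = rng.normal(size=d_latent)

# (2) Generator stand-in: the latent acts as a prompt biasing token logits.
#     A real causal Transformer would attend over z and previous tokens.
W_prompt = rng.normal(size=(vocab, d_latent))
def sample_molecule(z):
    tokens = []
    for _ in range(seq_len):
        logits = W_prompt @ z + rng.normal(size=vocab) * 0.1
        p = np.exp(logits - logits.max())
        tokens.append(int(rng.choice(vocab, p=p / p.sum())))
    return tokens

# (3) Property predictor reads the same latent prompt directly
w_prop = rng.normal(size=d_latent)
def predict_property(z):
    return float(w_prop @ z)

mol = sample_molecule(z)      # token sequence standing in for a molecule
y_hat = predict_property(z)   # predicted property for the same latent
```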
arXiv Detail & Related papers (2024-02-27T03:33:23Z) - Polynomial Chaos Surrogate Construction for Random Fields with Parametric Uncertainty [0.0]
Surrogate models provide a means of circumventing the high computational expense of complex models.
We develop a PCE surrogate on a joint space of intrinsic and parametric uncertainty, enabled by the Rosenblatt transformation.
We then take advantage of closed-form solutions for computing PCE Sobol indices to perform a global sensitivity analysis of the model.
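The closed-form Sobol computation mentioned above follows from orthonormality of the PCE basis: each squared coefficient contributes its share of the output variance, so a first-order index is just a ratio of coefficient sums. The multi-indices and coefficients below are hypothetical, for illustration only.

```python
import numpy as np

# Toy PCE with an ORTHONORMAL basis: y = sum_alpha c_alpha * Psi_alpha(x),
# for two inputs; alpha = (0, 0) is the mean term.
multi_indices = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]
coeffs = np.array([1.0, 0.8, 0.5, 0.3, 0.2, 0.1])  # hypothetical values

# Total variance: every squared coefficient except the mean term
total_var = sum(c**2 for a, c in zip(multi_indices, coeffs) if a != (0, 0))

def first_order_sobol(i):
    """Variance fraction from terms involving ONLY input i."""
    v = sum(c**2 for a, c in zip(multi_indices, coeffs)
            if a[i] > 0 and all(a[j] == 0 for j in range(len(a)) if j != i))
    return v / total_var

S1, S2 = first_order_sobol(0), first_order_sobol(1)
# S1 + S2 < 1: the remainder comes from the (1, 1) interaction term
```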
arXiv Detail & Related papers (2023-11-01T14:41:54Z) - Conditional Karhunen-Loève regression model with Basis Adaptation for high-dimensional problems: uncertainty quantification and inverse modeling [62.997667081978825]
We propose a methodology for improving the accuracy of surrogate models of the observable response of physical systems.
We apply the proposed methodology to constructing surrogate models via the Basis Adaptation (BA) method of the stationary hydraulic head response.
arXiv Detail & Related papers (2023-07-05T18:14:38Z) - Variational Autoencoding Molecular Graphs with Denoising Diffusion Probabilistic Model [0.0]
We propose a novel deep generative model that incorporates a hierarchical structure into the probabilistic latent vectors.
We demonstrate, through experiments on small datasets of physical properties and activity, that our model can design effective molecular latent vectors for molecular property prediction.
arXiv Detail & Related papers (2023-07-02T17:29:41Z) - Molecule Design by Latent Space Energy-Based Modeling and Gradual Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performance on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z) - A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z) - Data-driven Uncertainty Quantification in Computational Human Head Models [0.6745502291821954]
Modern biofidelic head model simulations are associated with very high computational cost and high-dimensional inputs and outputs.
In this study, a two-stage, data-driven manifold learning-based framework is proposed for uncertainty quantification (UQ) of computational head models.
It is demonstrated that the surrogate models provide highly accurate approximations of the computational model while significantly reducing the computational cost.
arXiv Detail & Related papers (2021-10-29T05:42:31Z) - Physics-Constrained Predictive Molecular Latent Space Discovery with Graph Scattering Variational Autoencoder [0.0]
We develop a molecular generative model based on variational inference and graph theory in the small data regime.
The model's performance is evaluated by generating molecules with desired target properties.
arXiv Detail & Related papers (2020-09-29T09:05:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.