Mass Balance Approximation of Unfolding Improves Potential-Like Methods for Protein Stability Predictions
- URL: http://arxiv.org/abs/2504.06806v1
- Date: Wed, 09 Apr 2025 11:53:02 GMT
- Title: Mass Balance Approximation of Unfolding Improves Potential-Like Methods for Protein Stability Predictions
- Authors: Ivan Rossi, Guido Barducci, Tiziana Sanavia, Paola Turina, Emidio Capriotti, Piero Fariselli
- Abstract summary: Deep-learning strategies have pushed the field forward, but their use in standard methods remains limited due to resource demands. This study shows that incorporating a mass-balance correction (MBC) to account for the unfolded state significantly enhances these methods. While many machine learning models partially model this balance, our analysis suggests that a refined representation of the unfolded state may improve the predictive performance.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prediction of protein stability changes following single-point mutations plays a pivotal role in computational biology, particularly in areas like drug discovery, enzyme reengineering, and genetic disease analysis. Although deep-learning strategies have pushed the field forward, their use in standard workflows remains limited due to resource demands. Conversely, potential-like methods are fast, intuitive, and efficient. Yet, these typically estimate Gibbs free energy shifts without considering the free-energy variations in the unfolded protein state, an omission that may breach mass balance and diminish accuracy. This study shows that incorporating a mass-balance correction (MBC) to account for the unfolded state significantly enhances these methods. While many machine learning models partially model this balance, our analysis suggests that a refined representation of the unfolded state may improve the predictive performance.
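To make the correction concrete, here is a minimal Python sketch of the mass-balance idea behind the thermodynamic cycle for ΔΔG; the energy values and function names (`ddg_folded_only`, `ddg_with_mbc`) are illustrative stand-ins, not the authors' implementation.

```python
# Hypothetical sketch of a mass-balance correction (MBC) for a
# potential-like DDG predictor; the energy terms are placeholders.

def ddg_folded_only(e_folded_wt: float, e_folded_mut: float) -> float:
    """Classic potential-like estimate: only the folded state is scored."""
    return e_folded_mut - e_folded_wt

def ddg_with_mbc(e_folded_wt: float, e_folded_mut: float,
                 e_unfolded_wt: float, e_unfolded_mut: float) -> float:
    """Mass-balance-corrected estimate: the unfolded state enters the cycle.

    DDG = (E_f,mut - E_u,mut) - (E_f,wt - E_u,wt)
        = (E_f,mut - E_f,wt) - (E_u,mut - E_u,wt)
    """
    return (e_folded_mut - e_folded_wt) - (e_unfolded_mut - e_unfolded_wt)

if __name__ == "__main__":
    # Toy numbers (kcal/mol) purely for illustration.
    print(ddg_folded_only(-100.0, -98.5))             # 1.5
    print(ddg_with_mbc(-100.0, -98.5, -20.0, -19.2))  # 1.5 - 0.8 = 0.7
```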
Related papers
- Beyond Progress Measures: Theoretical Insights into the Mechanism of Grokking [50.465604300990904]
Grokking refers to the abrupt improvement in test accuracy after extended overfitting. We investigate the grokking mechanism underlying the Transformer in the task of prime number operations.
arXiv Detail & Related papers (2025-04-04T04:42:38Z) - AlgoRxplorers | Precision in Mutation: Enhancing Drug Design with Advanced Protein Stability Prediction Tools [0.6749750044497732]
Predicting the impact of single-point amino acid mutations on protein stability is essential for understanding disease mechanisms and advancing drug development.
Protein stability, quantified by changes in Gibbs free energy ($\Delta\Delta G$), is influenced by these mutations.
This study proposes the application of deep neural networks, leveraging transfer learning and fusing complementary information from different models, to create a feature-rich representation of the protein stability landscape.
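As a rough sketch of the fusion idea described above (the encoder dimensions and head architecture are assumptions, not the paper's actual model), embeddings from two frozen pretrained encoders could be concatenated and fed to a small regression head:

```python
import torch
import torch.nn as nn

class FusionDDGHead(nn.Module):
    """Toy fusion head: concatenates embeddings from two (frozen) pretrained
    encoders and regresses DDG. Dimensions are arbitrary."""

    def __init__(self, dim_a: int = 256, dim_b: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim_a + dim_b, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, emb_a: torch.Tensor, emb_b: torch.Tensor) -> torch.Tensor:
        return self.mlp(torch.cat([emb_a, emb_b], dim=-1)).squeeze(-1)

# Random stand-ins for the two encoders' outputs (batch of 4 variants).
head = FusionDDGHead()
print(head(torch.randn(4, 256), torch.randn(4, 128)).shape)  # torch.Size([4])
```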
arXiv Detail & Related papers (2025-01-13T02:17:01Z) - Kermut: Composite kernel regression for protein variant effects [0.9262403397108374]
We provide a Gaussian process regression model, Kermut, with a novel composite kernel for modeling mutation similarity.
An analysis of the quality of the uncertainty estimates demonstrates that our model provides meaningful levels of overall calibration.
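A hedged scikit-learn sketch of the composite-kernel idea (the kernel components and the random features below are generic stand-ins, not Kermut's actual kernel or inputs):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, DotProduct, WhiteKernel

# Illustrative composite kernel: a smooth term times a linear term over
# toy variant features, plus an observation-noise term.
kernel = RBF(length_scale=1.0) * DotProduct() + WhiteKernel(noise_level=0.1)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))   # stand-in variant features
y = rng.normal(size=50)        # stand-in variant effects

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mean, std = gp.predict(X[:5], return_std=True)  # predictions with uncertainty
print(mean.shape, std.shape)
```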
arXiv Detail & Related papers (2024-04-09T14:08:06Z) - Protein Conformation Generation via Force-Guided SE(3) Diffusion Models [48.48934625235448]
Deep generative modeling techniques have been employed to generate novel protein conformations.
We propose a force-guided SE(3) diffusion model, ConfDiff, for protein conformation generation.
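A minimal, hypothetical illustration of force guidance during reverse sampling: a physics-based force term is added to a learned score inside a Langevin-like update. `score_model` and `force_fn` are toy placeholders, and this is not ConfDiff's SE(3) formulation.

```python
import numpy as np

def guided_reverse_step(x, t, score_model, force_fn, step=1e-2, guidance=0.1):
    """One toy Langevin-like reverse step where a physical force term
    (negative energy gradient) is mixed into the learned score."""
    score = score_model(x, t)            # learned score of the noised data
    force = force_fn(x)                  # physics-based guidance signal
    noise = np.random.normal(size=x.shape)
    return x + step * (score + guidance * force) + np.sqrt(2 * step) * noise

# Stand-ins: a standard-Gaussian score and the force of a quadratic energy.
score_model = lambda x, t: -x
force_fn = lambda x: -2.0 * x            # force = -grad of E(x) = |x|^2

x = np.random.normal(size=(16, 3))       # toy 3D coordinates, 16 particles
for t in reversed(range(100)):
    x = guided_reverse_step(x, t, score_model, force_fn)
print(x.mean(), x.std())
```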
arXiv Detail & Related papers (2024-03-21T02:44:08Z) - Efficiently Predicting Protein Stability Changes Upon Single-point Mutation with Large Language Models [51.57843608615827]
The ability to precisely predict protein thermostability is pivotal for various subfields and applications in biochemistry.
We introduce an ESM-assisted efficient approach that integrates protein sequence and structural features to predict the thermostability changes in protein upon single-point mutations.
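A schematic sketch of that general recipe (sequence-embedding differences combined with structural descriptors feeding a regressor); `embed_sequence` is a hypothetical placeholder for a protein language model such as ESM, and nothing below reflects the paper's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def embed_sequence(seq: str, dim: int = 64) -> np.ndarray:
    """Hypothetical stand-in for a protein language model embedding
    (e.g. a per-residue representation averaged over the sequence)."""
    rng = np.random.default_rng(abs(hash(seq)) % (2**32))
    return rng.normal(size=dim)

def mutation_features(wt: str, mut: str, struct_feats: np.ndarray) -> np.ndarray:
    """Concatenate the embedding difference with structural descriptors."""
    return np.concatenate([embed_sequence(mut) - embed_sequence(wt), struct_feats])

# Toy training set: a few invented single-point mutants of a short sequence.
rng = np.random.default_rng(1)
mutants = ["ACDKFG", "ACDEFA", "GCDEFG", "ACWEFG"]
X = np.stack([mutation_features("ACDEFG", mutants[i % 4], rng.normal(size=8))
              for i in range(100)])
y = rng.normal(size=100)  # stand-in thermostability-change labels

model = GradientBoostingRegressor().fit(X, y)
print(model.predict(X[:3]))
```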
arXiv Detail & Related papers (2023-12-07T03:25:49Z) - Unbalanced Diffusion Schrödinger Bridge [71.31485908125435]
We introduce unbalanced DSBs which model the temporal evolution of marginals with arbitrary finite mass.
This is achieved by deriving the time reversal of differential equations with killing and birth terms.
We present two novel algorithmic schemes that comprise a scalable objective function for training unbalanced DSBs.
arXiv Detail & Related papers (2023-06-15T12:51:56Z) - Predicting protein stability changes under multiple amino acid substitutions using equivariant graph neural networks [2.5137859989323537]
We propose improvements to state-of-the-art deep learning (DL) protein stability prediction models.
This was achieved using E(3)-equivariant graph neural networks (EGNNs) for both atomic environment (AE) embedding and residue-level scoring tasks.
We demonstrate the immediately promising results of this procedure, discuss the current shortcomings, and highlight potential future strategies.
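For intuition, here is a minimal E(3)-equivariant message-passing layer in the style of Satorras et al., operating on a fully connected graph; it is a generic sketch, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

class EGNNLayer(nn.Module):
    """Minimal E(3)-equivariant layer on a fully connected graph:
    features are updated invariantly, coordinates equivariantly."""

    def __init__(self, h_dim: int = 32, m_dim: int = 32):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * h_dim + 1, m_dim), nn.SiLU())
        self.coord_mlp = nn.Linear(m_dim, 1)
        self.node_mlp = nn.Sequential(nn.Linear(h_dim + m_dim, h_dim), nn.SiLU())

    def forward(self, h, x):
        n = h.shape[0]
        diff = x[:, None, :] - x[None, :, :]               # (n, n, 3)
        dist2 = (diff ** 2).sum(-1, keepdim=True)          # invariant distances
        hi = h[:, None, :].expand(n, n, -1)
        hj = h[None, :, :].expand(n, n, -1)
        m = self.edge_mlp(torch.cat([hi, hj, dist2], dim=-1))
        x = x + (diff * self.coord_mlp(m)).mean(dim=1)     # equivariant update
        h = self.node_mlp(torch.cat([h, m.sum(dim=1)], dim=-1))
        return h, x

h, x = torch.randn(10, 32), torch.randn(10, 3)   # 10 atoms: features, coords
h2, x2 = EGNNLayer()(h, x)
print(h2.shape, x2.shape)
```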
arXiv Detail & Related papers (2023-05-30T14:48:06Z) - Graph Neural Network Interatomic Potential Ensembles with Calibrated Aleatoric and Epistemic Uncertainty on Energy and Forces [9.378581265532006]
We present a complete framework for training and recalibrating graph neural network ensemble models to produce accurate predictions of energy and forces.
The proposed method considers both epistemic and aleatoric uncertainty and the total uncertainties are recalibrated post hoc.
A detailed analysis of the predictive performance and uncertainty calibration is provided.
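A toy illustration of that epistemic/aleatoric decomposition for an ensemble of mean-variance predictors, with a single post-hoc recalibration scale; the array shapes and the `scale` factor are assumptions made for illustration only.

```python
import numpy as np

def ensemble_uncertainty(member_means, member_vars, scale=1.0):
    """Toy decomposition for an ensemble of mean/variance predictors.

    member_means, member_vars: arrays of shape (n_models, n_samples).
    Epistemic = variance of the member means; aleatoric = mean predicted
    variance; `scale` is a post-hoc recalibration factor fit on held-out data.
    """
    mean = member_means.mean(axis=0)
    epistemic = member_means.var(axis=0)
    aleatoric = member_vars.mean(axis=0)
    return mean, scale * (epistemic + aleatoric)

rng = np.random.default_rng(0)
means = rng.normal(size=(5, 100))                  # 5 models, 100 test energies
variances = rng.uniform(0.1, 0.5, size=(5, 100))   # per-model predicted variance
mu, sigma2 = ensemble_uncertainty(means, variances, scale=1.2)
print(mu.shape, sigma2.shape)
```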
arXiv Detail & Related papers (2023-05-10T13:03:06Z) - Agnostic Physics-Driven Deep Learning [82.89993762912795]
This work establishes that a physical system can perform statistical gradient learning without gradient computations.
In Aeqprop, the specifics of the system do not have to be known: the procedure is based on external manipulations.
Aeqprop also establishes that in natural (bio)physical systems, genuine gradient-based statistical learning may result from generic, relatively simple mechanisms.
arXiv Detail & Related papers (2022-05-30T12:02:53Z) - SPLDExtraTrees: Robust machine learning approach for predicting kinase inhibitor resistance [1.0674604700001966]
We propose a robust machine learning method, SPLDExtraTrees, which can accurately predict ligand binding affinity changes upon protein mutation.
The proposed method ranks training data following a specific scheme that starts with easy-to-learn samples.
Experiments substantiate the capability of the proposed method for predicting kinase inhibitor resistance under three scenarios.
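A rough sketch of a self-paced scheme built on scikit-learn's ExtraTreesRegressor, admitting easy (low-residual) samples first and retraining as harder ones are added; the schedule and thresholds are invented for illustration and do not reproduce SPLDExtraTrees.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

def self_paced_extra_trees(X, y, n_rounds: int = 4, start_frac: float = 0.5):
    """Toy self-paced training: start from the easiest samples and
    gradually admit harder ones while refitting the forest."""
    model = ExtraTreesRegressor(n_estimators=100, random_state=0).fit(X, y)
    for r in range(n_rounds):
        residuals = np.abs(model.predict(X) - y)
        frac = min(1.0, start_frac + r * (1.0 - start_frac) / (n_rounds - 1))
        keep = residuals <= np.quantile(residuals, frac)   # easy samples first
        model = ExtraTreesRegressor(n_estimators=100,
                                    random_state=0).fit(X[keep], y[keep])
    return model

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X[:, 0] + 0.1 * rng.normal(size=200)   # stand-in affinity-change labels
print(self_paced_extra_trees(X, y).predict(X[:3]))
```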
arXiv Detail & Related papers (2021-11-15T09:07:45Z) - Benchmarking adaptive variational quantum eigensolvers [63.277656713454284]
We benchmark the accuracy of VQE and ADAPT-VQE to calculate the electronic ground states and potential energy curves.
We find both methods provide good estimates of the energy and ground state.
Gradient-based optimization is more economical and delivers superior performance than analogous simulations carried out with gradient-free optimizers.
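A toy single-qubit analogue of such a benchmark, comparing a gradient-based and a gradient-free optimizer on a variational energy; the Hamiltonian and ansatz are arbitrary stand-ins, not the molecular systems studied in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy "VQE-like" problem: minimize <psi(theta)|H|psi(theta)> for one qubit.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                       # arbitrary Hermitian matrix

def energy(theta):
    psi = np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)])
    return psi @ H @ psi

exact = np.linalg.eigvalsh(H)[0]
for method in ("BFGS", "Nelder-Mead"):            # gradient-based vs gradient-free
    res = minimize(energy, x0=[0.3], method=method)
    print(f"{method}: {res.fun:.6f}  (exact ground state: {exact:.6f})")
```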
arXiv Detail & Related papers (2020-11-02T19:52:04Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but its precise role in that success is still unclear.
We show that heavy tails commonly arise in the parameters of discrete-time stochastic optimization due to multiplicative noise.
A detailed analysis is conducted in which we describe the dependence on key factors, including step size and data, with similar results observed on state-of-the-art neural network models.
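A small simulation of the underlying mechanism, a Kesten-type recursion with multiplicative noise, which tends to produce heavy-tailed iterates; the parameters are arbitrary and this is only a cartoon of the paper's analysis.

```python
import numpy as np

# Toy Kesten-type recursion x_{t+1} = a_t * x_t + b_t: multiplicative noise
# in the update can yield a heavy-tailed stationary distribution.
rng = np.random.default_rng(0)
n_chains, n_steps = 5_000, 1_000
x = np.zeros(n_chains)
for _ in range(n_steps):
    a = 1.0 + 0.5 * rng.normal(size=n_chains)   # multiplicative factor
    b = rng.normal(size=n_chains)               # additive noise
    x = a * x + b
# Widely separated percentiles of |x| indicate heavy tails.
print(np.percentile(np.abs(x), [50, 99, 99.9]))
```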
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.