Molecular Fingerprints Are Strong Models for Peptide Function Prediction
- URL: http://arxiv.org/abs/2501.17901v1
- Date: Wed, 29 Jan 2025 10:05:27 GMT
- Title: Molecular Fingerprints Are Strong Models for Peptide Function Prediction
- Authors: Jakub Adamczyk, Piotr Ludynia, Wojciech Czech
- Abstract summary: We study the effectiveness of molecular fingerprints for peptide property prediction.
We demonstrate that domain-specific feature extraction from molecular graphs can outperform complex and computationally expensive models.
- Abstract: We study the effectiveness of molecular fingerprints for peptide property prediction and demonstrate that domain-specific feature extraction from molecular graphs can outperform complex and computationally expensive models such as GNNs, pretrained sequence-based transformers and multimodal ensembles, even without hyperparameter tuning. To this end, we perform a thorough evaluation on 126 datasets, achieving state-of-the-art results on LRGB and 5 other peptide function prediction benchmarks. We show that models based on count variants of ECFP, Topological Torsion, and RDKit molecular fingerprints and LightGBM as classification head are remarkably robust. The strong performance of molecular fingerprints, which are intrinsically very short-range feature encoders, challenges the presumed importance of long-range interactions in peptides. Our conclusion is that the use of molecular fingerprints for larger molecules, such as peptides, can be a computationally feasible, low-parameter, and versatile alternative to sophisticated deep learning models.
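The abstract's pipeline rests on count fingerprints that hash local circular atom environments into a fixed-length vector, which then feeds a LightGBM classifier. Below is a rough, self-contained sketch of that count-fingerprint idea. It is a toy stand-in, not RDKit's actual ECFP implementation: the function `count_fingerprint`, the atom/bond list inputs, and the hashing scheme are all illustrative assumptions.

```python
import hashlib

def count_fingerprint(atoms, bonds, radius=2, n_bits=64):
    """Hash circular atom environments into a fixed-length count vector.
    Illustrative stand-in for count variants of ECFP; a real pipeline
    would use RDKit's Morgan fingerprint generator instead."""
    # Build an adjacency list from the bond pairs.
    adj = {i: [] for i in range(len(atoms))}
    for a, b in bonds:
        adj[a].append(b)
        adj[b].append(a)
    # Initial identifier for each atom: its element symbol.
    ids = {i: atoms[i] for i in range(len(atoms))}
    counts = [0] * n_bits
    for _ in range(radius + 1):
        # Hash every current environment into a bucket and count it.
        for ident in ids.values():
            h = int(hashlib.sha1(ident.encode()).hexdigest(), 16)
            counts[h % n_bits] += 1
        # Grow each environment by one bond: append sorted neighbor ids.
        ids = {i: ids[i] + "".join(sorted(ids[j] for j in adj[i]))
               for i in ids}
    return counts

# Toy "molecule": the heavy atoms of ethanol, C-C-O.
fp = count_fingerprint(["C", "C", "O"], [(0, 1), (1, 2)])
```

In practice one would compute count fingerprints with RDKit and pass the resulting vectors to a LightGBM classification head; the sketch only illustrates why such a representation is intrinsically short-range, since every counted feature describes an environment at most `radius` bonds wide.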
Related papers
- DiffMS: Diffusion Generation of Molecules Conditioned on Mass Spectra [60.39311767532607]
DiffMS is a formula-restricted encoder-decoder generative network.
We develop a robust decoder that bridges latent embeddings and molecular structures.
Experiments show DiffMS outperforms existing models on de novo molecule generation.
arXiv Detail & Related papers (2025-02-13T18:29:48Z) - Pre-trained Molecular Language Models with Random Functional Group Masking [54.900360309677794]
We propose a SMILES-based Molecular Language Model that randomly masks SMILES subsequences corresponding to specific molecular atoms.
This technique aims to compel the model to better infer molecular structures and properties, thus enhancing its predictive capabilities.
arXiv Detail & Related papers (2024-11-03T01:56:15Z) - SE(3)-Invariant Multiparameter Persistent Homology for Chiral-Sensitive Molecular Property Prediction [1.534667887016089]
We present a novel method for generating molecular fingerprints using multiparameter persistent homology (MPPH).
This technique holds considerable significance for drug discovery and materials science, where precise molecular property prediction is vital.
We demonstrate its superior performance over existing state-of-the-art methods in predicting molecular properties through extensive evaluations on the MoleculeNet benchmark.
arXiv Detail & Related papers (2023-12-12T09:33:54Z) - ADMET property prediction through combinations of molecular fingerprints [0.0]
Random forests or support vector machines paired with extended-connectivity fingerprints consistently outperformed recently developed methods.
A detailed investigation into regression algorithms and molecular fingerprints revealed gradient-boosted decision trees as a particularly effective choice.
We successfully validated our model across 22 Therapeutics Data Commons ADMET benchmarks.
arXiv Detail & Related papers (2023-09-29T22:39:18Z) - Efficient Prediction of Peptide Self-assembly through Sequential and Graphical Encoding [57.89530563948755]
This work provides a benchmark analysis of peptide encoding with advanced deep learning models.
It serves as a guide for a wide range of peptide-related predictions such as isoelectric points, hydration free energy, etc.
arXiv Detail & Related papers (2023-07-17T00:43:33Z) - Implicit Geometry and Interaction Embeddings Improve Few-Shot Molecular Property Prediction [53.06671763877109]
We develop molecular embeddings that encode complex molecular characteristics to improve the performance of few-shot molecular property prediction.
Our approach leverages large amounts of synthetic data, namely the results of molecular docking calculations.
On multiple molecular property prediction benchmarks, training from the embedding space substantially improves Multi-Task, MAML, and Prototypical Network few-shot learning performance.
arXiv Detail & Related papers (2023-02-04T01:32:40Z) - MolCPT: Molecule Continuous Prompt Tuning to Generalize Molecular Representation Learning [77.31492888819935]
We propose a novel paradigm of "pre-train, prompt, fine-tune" for molecular representation learning, named molecule continuous prompt tuning (MolCPT).
MolCPT defines a motif prompting function that uses the pre-trained model to project the standalone input into an expressive prompt.
Experiments on several benchmark datasets show that MolCPT efficiently generalizes pre-trained GNNs for molecular property prediction.
arXiv Detail & Related papers (2022-12-20T19:32:30Z) - Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a machine learning method that operates directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z) - FP-GNN: a versatile deep learning architecture for enhanced molecular property prediction [3.9838024725595167]
FP-GNN is a novel deep learning architecture that combines and simultaneously learns information from molecular graphs and fingerprints.
We conducted experiments on 13 public datasets, an unbiased LIT-PCBA dataset, and 14 phenotypic screening datasets for breast cell lines.
The FP-GNN algorithm achieved state-of-the-art performance on these datasets.
arXiv Detail & Related papers (2022-05-08T10:36:12Z) - Do Large Scale Molecular Language Representations Capture Important Structural Information? [31.76876206167457]
We present molecular embeddings obtained by training an efficient transformer encoder model, referred to as MoLFormer.
Experiments show that the learned molecular representation performs competitively when compared to graph-based and fingerprint-based supervised learning baselines.
arXiv Detail & Related papers (2021-06-17T14:33:55Z) - A Systematic Comparison Study on Hyperparameter Optimisation of Graph Neural Networks for Molecular Property Prediction [8.02401104726362]
Graph neural networks (GNNs) have been proposed for a wide range of graph-related learning tasks.
In recent years an increasing number of GNN systems have been applied to predict molecular properties.
arXiv Detail & Related papers (2021-02-08T15:40:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.