A Perspective on Deep Learning for Molecular Modeling and Simulations
- URL: http://arxiv.org/abs/2004.13011v1
- Date: Sat, 25 Apr 2020 22:58:25 GMT
- Title: A Perspective on Deep Learning for Molecular Modeling and Simulations
- Authors: Jun Zhang, Yao-Kun Lei, Zhen Zhang, Junhan Chang, Maodong Li, Xu Han,
Lijiang Yang, Yi Isaac Yang and Yi Qin Gao
- Abstract summary: We focus on the limitations of traditional deep learning models from the perspective of molecular physics.
We summarize several representative applications, ranging from supervised to unsupervised and reinforcement learning.
We outline promising directions which may help address the existing issues in the current framework of deep molecular modeling.
- Score: 8.891007063629187
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning is transforming many areas in science, and it has great
potential in modeling molecular systems. However, unlike the mature deployment
of deep learning in computer vision and natural language processing, its
development in molecular modeling and simulations is still at an early stage,
largely because the inductive biases of molecules are completely different from
those of images or texts. Building on these differences, we first review the
limitations of traditional deep learning models from the perspective of
molecular physics and summarize relevant technical advances at the interface
between molecular modeling and deep learning. Rather than focusing merely on
ever more complex neural network models, we emphasize the theories and ideas
behind modern deep learning. We hope that translating these ideas into
molecular modeling will create new opportunities. For this purpose, we
summarize several representative applications, ranging from supervised to
unsupervised and reinforcement learning, and discuss their connections with
emerging trends in deep learning. Finally, we outline promising directions
which may help address the existing issues in the current framework of deep
molecular modeling.
Related papers
- GraphXForm: Graph transformer for computer-aided molecular design with application to extraction [73.1842164721868]
We present GraphXForm, a decoder-only graph transformer architecture, which is pretrained on existing compounds and then fine-tuned.
We evaluate it on two solvent design tasks for liquid-liquid extraction, showing that it outperforms four state-of-the-art molecular design techniques.
arXiv Detail & Related papers (2024-11-03T19:45:15Z)
- Bridging Text and Molecule: A Survey on Multimodal Frameworks for Molecule [16.641797535842752]
In this paper, we present the first systematic survey on multimodal frameworks for molecule research.
We begin with the development of molecular deep learning and point out the necessity of involving the textual modality.
Furthermore, we delve into the utilization of large language models and prompting techniques for molecular tasks and present significant applications in drug discovery.
arXiv Detail & Related papers (2024-03-07T03:03:13Z)
- Masked Modeling for Self-supervised Representation Learning on Vision and Beyond [69.64364187449773]
Masked modeling has emerged as a distinctive approach that involves predicting parts of the original data that are proportionally masked during training.
We elaborate on the details of techniques within masked modeling, including diverse masking strategies, recovering targets, network architectures, and more.
We conclude by discussing the limitations of current techniques and point out several potential avenues for advancing masked modeling research.
arXiv Detail & Related papers (2023-12-31T12:03:21Z)
- Bidirectional Generation of Structure and Properties Through a Single Molecular Foundation Model [44.60174246341653]
We present a novel multimodal molecular pre-trained model that incorporates the modalities of structure and biochemical properties.
Our proposed model pipeline of data handling and training objectives aligns the structure/property features in a common embedding space.
These contributions yield synergistic knowledge, allowing us to tackle both multimodal and unimodal downstream tasks with a single model.
arXiv Detail & Related papers (2022-11-19T05:16:08Z)
- A Molecular Multimodal Foundation Model Associating Molecule Graphs with Natural Language [63.60376252491507]
We propose a molecular multimodal foundation model which is pretrained from molecular graphs and their semantically related textual data.
We believe that our model would have a broad impact on AI-empowered fields across disciplines such as biology, chemistry, materials, environment, and medicine.
arXiv Detail & Related papers (2022-09-12T00:56:57Z)
- MolGenSurvey: A Systematic Survey in Machine Learning Models for Molecule Design [46.06839497430207]
Due to the large search space, it is impossible for human experts to enumerate and test all molecules in wet-lab experiments.
With the rapid development of machine learning methods, molecule design has achieved great progress by leveraging machine learning models to generate candidate molecules.
arXiv Detail & Related papers (2022-03-28T05:05:11Z)
- Knowledge-informed Molecular Learning: A Survey on Paradigm Transfer [20.893861195128643]
Machine learning, notably deep learning, has significantly propelled molecular investigations within the biochemical sphere.
Traditionally, modeling for such research has centered around a handful of paradigms.
To enhance the generalizability and interpretability of purely data-driven models, scholars have integrated biochemical domain knowledge into these molecular study models.
arXiv Detail & Related papers (2022-02-17T06:18:02Z)
- Learning Neural Generative Dynamics for Molecular Conformation Generation [89.03173504444415]
We study how to generate molecular conformations (i.e., 3D structures) from a molecular graph.
We propose a novel probabilistic framework to generate valid and diverse conformations given a molecular graph.
arXiv Detail & Related papers (2021-02-20T03:17:58Z)
- Molecular CT: Unifying Geometry and Representation Learning for Molecules at Different Scales [3.987395340580183]
A new deep neural network architecture, the Molecular Configuration Transformer (Molecular CT), is introduced for this purpose.
Its computational efficiency and universality make Molecular CT versatile for a variety of molecular learning scenarios.
As examples, we show that Molecular CT enables representation learning for molecular systems at different scales and achieves comparable or improved results on common benchmarks.
arXiv Detail & Related papers (2020-12-22T03:41:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.