DeepChem Equivariant: SE(3)-Equivariant Support in an Open-Source Molecular Machine Learning Library
- URL: http://arxiv.org/abs/2510.16897v1
- Date: Sun, 19 Oct 2025 15:45:57 GMT
- Title: DeepChem Equivariant: SE(3)-Equivariant Support in an Open-Source Molecular Machine Learning Library
- Authors: Jose Siguenza, Bharath Ramsundar
- Abstract summary: We extend DEEPCHEM with support for ready-to-use equivariant models, enabling scientists with minimal deep learning background to build, train, and evaluate models. Our implementation includes equivariant models, complete training pipelines, and a toolkit of equivariant utilities, supported with comprehensive tests and documentation.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural networks that incorporate geometric relationships respecting SE(3) group transformations (e.g. rotations and translations) are increasingly important in molecular applications, such as molecular property prediction, protein structure modeling, and materials design. These models, known as SE(3)-equivariant neural networks, ensure outputs transform predictably with input coordinate changes by explicitly encoding spatial atomic positions. Although libraries such as E3NN [4] and SE(3)-TRANSFORMER [3] offer powerful implementations, they often require substantial deep learning or mathematical prior knowledge and lack complete training pipelines. We extend DEEPCHEM [13] with support for ready-to-use equivariant models, enabling scientists with minimal deep learning background to build, train, and evaluate models, such as SE(3)-Transformer and Tensor Field Networks. Our implementation includes equivariant models, complete training pipelines, and a toolkit of equivariant utilities, supported with comprehensive tests and documentation, to facilitate both application and further development of SE(3)-equivariant models.
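The equivariance property described in the abstract can be checked numerically: for a vector-valued function f, SE(3)-equivariance means f(Rx + t) = R f(x) for any rotation R and translation t. A minimal sketch of such a check (this is a toy function built from relative atomic vectors, not DeepChem's API):

```python
import numpy as np

def random_rotation(rng):
    # QR decomposition of a random matrix yields an orthogonal matrix;
    # fix the determinant to +1 so it is a proper rotation in SO(3).
    q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1
    return q

def toy_equivariant_model(coords):
    # Toy vector-valued function: sum of unit displacement vectors from
    # the centroid. Built from relative vectors only, so it is
    # equivariant to rotations and invariant to translations.
    rel = coords - coords.mean(axis=0)
    norms = np.linalg.norm(rel, axis=1, keepdims=True) + 1e-9
    return (rel / norms).sum(axis=0)

rng = np.random.default_rng(0)
coords = rng.standard_normal((5, 3))   # 5 atoms in 3D
R = random_rotation(rng)
t = rng.standard_normal(3)

out = toy_equivariant_model(coords)
out_transformed = toy_equivariant_model(coords @ R.T + t)

# SE(3)-equivariance for vector outputs: f(Rx + t) == R f(x).
assert np.allclose(out_transformed, R @ out, atol=1e-8)
```

The same transform-compare test applies to any candidate equivariant layer: feed rotated/translated coordinates and verify the output transforms accordingly.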
Related papers
- Transolver-3: Scaling Up Transformer Solvers to Industrial-Scale Geometries
Transolver-3 is a new member of the Transolver family designed for high-fidelity physics simulations. We show that Transolver-3 is capable of handling meshes with over 160 million cells, achieving impressive performance across three challenging simulation benchmarks.
arXiv Detail & Related papers (2026-02-04T16:52:44Z)
- Quantized SO(3)-Equivariant Graph Neural Networks for Efficient Molecular Property Prediction
This paper addresses the problem by compressing and accelerating an SO(3)-equivariant GNN using low-bit quantization techniques. Experiments on the QM9 and rMD17 molecular benchmarks demonstrate that our 8-bit models achieve accuracy on energy and force predictions comparable to full-precision baselines. The proposed techniques enable the deployment of symmetry-aware GNNs in practical chemistry applications with 2.37--2.73x faster inference and 4x smaller model size.
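The 8-bit quantization mentioned in this summary can be illustrated in a generic form (a symmetric per-tensor int8 scheme; this is a sketch of the general technique, not the paper's exact method):

```python
import numpy as np

def quantize_int8(w):
    # Symmetric per-tensor quantization: one scale maps max(|w|) to 127.
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
q, scale = quantize_int8(w)

# int8 storage is 4x smaller than float32, matching the "4x smaller" figure.
assert q.nbytes * 4 == w.nbytes
# Round-trip error is bounded by half a quantization step.
assert np.abs(dequantize(q, scale) - w).max() <= scale / 2 + 1e-6
```

Speedups like the quoted 2.37--2.73x additionally require integer-arithmetic kernels at inference time, which this sketch does not cover.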
arXiv Detail & Related papers (2026-01-05T15:36:04Z) - A Complete Guide to Spherical Equivariant Graph Transformers [0.7953056533753116]
Spherical equivariant graph neural networks (EGNNs) provide a principled framework for learning on three-dimensional molecular and biomolecular systems. This guide serves as a self-contained introduction for researchers and learners seeking to understand or implement spherical EGNNs.
arXiv Detail & Related papers (2025-12-15T22:03:09Z) - EquiCPI: SE(3)-Equivariant Geometric Deep Learning for Structure-Aware Prediction of Compound-Protein Interactions [0.0]
EquiCPI is an end-to-end geometric deep learning framework that synergizes first-principles structural modeling with SE(3)-equivariant neural networks. At its core, EquiCPI employs SE(3)-equivariant message passing over atomic point clouds, preserving symmetry under rotations, translations, and reflections. Evaluated on BindingDB (affinity prediction) and DUD-E (virtual screening), EquiCPI performs on par with or exceeds state-of-the-art deep learning competitors.
arXiv Detail & Related papers (2025-04-07T00:57:08Z) - Large Language-Geometry Model: When LLM meets Equivariance [53.8505081745406]
We propose EquiLLM, a novel framework for representing 3D physical systems. We show that EquiLLM delivers significant improvements over previous methods across molecular dynamics simulation, human motion simulation, and antibody design.
arXiv Detail & Related papers (2025-02-16T14:50:49Z) - Learning Modulated Transformation in GANs [69.95217723100413]
We equip the generator in generative adversarial networks (GANs) with a plug-and-play module, termed the modulated transformation module (MTM).
MTM predicts spatial offsets under the control of latent codes, based on which the convolution operation can be applied at variable locations.
Notably, for human generation on the challenging TaiChi dataset, we improve the FID of StyleGAN3 from 21.36 to 13.60, demonstrating the efficacy of learning modulated geometry transformation.
arXiv Detail & Related papers (2023-08-29T17:51:22Z)
- FAENet: Frame Averaging Equivariant GNN for Materials Modeling
We introduce a flexible framework relying on stochastic frame averaging (SFA) to make any model E(3)-equivariant or invariant through data transformations.
We prove the validity of our method theoretically and empirically demonstrate its superior accuracy and computational scalability in materials modeling.
arXiv Detail & Related papers (2023-04-28T21:48:31Z)
- A new perspective on building efficient and expressive 3D equivariant graph neural networks
We propose a hierarchy of 3D isomorphism to evaluate the expressive power of equivariant GNNs.
Our work leads to two crucial modules for designing expressive and efficient geometric GNNs.
To demonstrate the applicability of our theory, we propose LEFTNet which effectively implements these modules.
arXiv Detail & Related papers (2023-04-07T18:08:27Z)
- Equivariant vector field network for many-body system modeling
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles
Prediction of a molecule's 3D conformer ensemble from the molecular graph plays a key role in cheminformatics and drug discovery.
Existing generative models have several drawbacks including lack of modeling important molecular geometry elements.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z)
- SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials
NequIP is an SE(3)-equivariant neural network approach for learning interatomic potentials from ab initio calculations for molecular dynamics simulations.
The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency.
arXiv Detail & Related papers (2021-01-08T18:49:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.