HyperDiffusionFields (HyDiF): Diffusion-Guided Hypernetworks for Learning Implicit Molecular Neural Fields
- URL: http://arxiv.org/abs/2510.18122v1
- Date: Mon, 20 Oct 2025 21:41:10 GMT
- Title: HyperDiffusionFields (HyDiF): Diffusion-Guided Hypernetworks for Learning Implicit Molecular Neural Fields
- Authors: Sudarshan Babu, Phillip Lo, Xiao Zhang, Aadi Srivastava, Ali Davariashtiyani, Jason Perera, Michael Maire, Aly A. Khan
- Abstract summary: We introduce HyperDiffusionFields (HyDiF), a framework that models 3D molecular conformers as continuous fields. At the core of our approach is the Molecular Directional Field (MDF), a vector field that maps any point in space to the direction of the nearest atom of a particular type. We demonstrate that our approach scales to larger biomolecules, illustrating a promising direction for field-based molecular modeling.
- Score: 12.849722578846178
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce HyperDiffusionFields (HyDiF), a framework that models 3D molecular conformers as continuous fields rather than discrete atomic coordinates or graphs. At the core of our approach is the Molecular Directional Field (MDF), a vector field that maps any point in space to the direction of the nearest atom of a particular type. We represent MDFs using molecule-specific neural implicit fields, which we call Molecular Neural Fields (MNFs). To enable learning across molecules and facilitate generalization, we adopt an approach where a shared hypernetwork, conditioned on a molecule, generates the weights of the given molecule's MNF. To endow the model with generative capabilities, we train the hypernetwork as a denoising diffusion model, enabling sampling in the function space of molecular fields. Our design naturally extends to a masked diffusion mechanism to support structure-conditioned generation tasks, such as molecular inpainting, by selectively noising regions of the field. Beyond generation, the localized and continuous nature of MDFs enables spatially fine-grained feature extraction for molecular property prediction, something not easily achievable with graph or point cloud based methods. Furthermore, we demonstrate that our approach scales to larger biomolecules, illustrating a promising direction for field-based molecular modeling.
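The Molecular Directional Field defined in the abstract can be made concrete with a short sketch: given the coordinates of all atoms of one element type, the field value at any query point is the unit vector toward the nearest such atom. This is a minimal NumPy illustration of that definition, not the paper's code; the function name and array shapes are assumptions.

```python
import numpy as np

def molecular_directional_field(query, atoms, eps=1e-12):
    """MDF sketch: unit vector from each query point toward its nearest atom.

    query: (Q, 3) array of arbitrary spatial points.
    atoms: (N, 3) array of atom coordinates for one atom type.
    Returns a (Q, 3) array of unit directions.
    """
    diff = atoms[None, :, :] - query[:, None, :]       # (Q, N, 3) offsets
    dist = np.linalg.norm(diff, axis=-1)               # (Q, N) distances
    nearest = np.argmin(dist, axis=1)                  # index of closest atom
    vec = diff[np.arange(len(query)), nearest]         # (Q, 3) offset to it
    return vec / (np.linalg.norm(vec, axis=-1, keepdims=True) + eps)
```

In HyDiF this field is not computed by nearest-neighbor lookup at inference time; it is the target that each molecule-specific Molecular Neural Field is trained to represent, so that the field is defined continuously everywhere in space.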
Related papers
- Molecular Representations in Implicit Functional Space via Hyper-Networks [53.70982267248536]
We argue that molecular learning can instead be formulated as learning in function space. We instantiate this formulation with MolField, a hyper-network-based framework that learns distributions over molecular fields. Our results show that treating molecules as continuous functions fundamentally changes how molecular representations generalize across tasks.
arXiv Detail & Related papers (2026-01-29T21:13:37Z) - Torsional-GFN: a conditional conformation generator for small molecules [75.91029322687771]
We introduce a conditional GFlowNet specifically designed to sample conformations of molecules proportionally to their Boltzmann distribution. Our work presents a promising avenue for scaling the proposed approach to larger molecular systems.
arXiv Detail & Related papers (2025-07-15T21:53:25Z) - LDMol: A Text-to-Molecule Diffusion Model with Structurally Informative Latent Space Surpasses AR Models [55.5427001668863]
We present a novel latent diffusion model dubbed LDMol for text-conditioned molecule generation. Experiments show that LDMol outperforms the existing autoregressive baselines on the text-to-molecule generation benchmark. We show that LDMol can be applied to downstream tasks such as molecule-to-text retrieval and text-guided molecule editing.
arXiv Detail & Related papers (2024-05-28T04:59:13Z) - Pre-training of Molecular GNNs via Conditional Boltzmann Generator [0.0]
We propose a pre-training method for molecular GNNs using an existing dataset of molecular conformations.
We show that our model has a better prediction performance for molecular properties than existing pre-training methods.
arXiv Detail & Related papers (2023-12-20T15:30:15Z) - Swallowing the Bitter Pill: Simplified Scalable Conformer Generation [12.341835649897886]
We present a novel way to predict molecular conformers through a simple formulation that sidesteps many of the equivariance constraints of prior works and achieves state-of-the-art results by exploiting the advantages of scale.
We are able to radically simplify structure learning, and make it trivial to scale up the model sizes.
This model, called Molecular Conformer Fields (MCF), works by parameterizing conformer structures as functions that map elements from a molecular graph directly to their 3D location in space.
arXiv Detail & Related papers (2023-11-27T22:53:41Z) - Molecule Design by Latent Space Energy-Based Modeling and Gradual Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performances on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z) - Towards Predicting Equilibrium Distributions for Molecular Systems with Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z) - MUDiff: Unified Diffusion for Complete Molecule Generation [104.7021929437504]
We present a new model for generating a comprehensive representation of molecules, including atom features, 2D discrete molecule structures, and 3D continuous molecule coordinates.
We propose a novel graph transformer architecture to denoise the diffusion process.
Our model is a promising approach for designing stable and diverse molecules and can be applied to a wide range of tasks in molecular modeling.
arXiv Detail & Related papers (2023-04-28T04:25:57Z) - Learning Harmonic Molecular Representations on Riemannian Manifold [18.49126496517951]
Molecular representation learning plays a crucial role in AI-assisted drug discovery research.
We propose a Harmonic Molecular Representation learning framework, which represents a molecule using the Laplace-Beltrami eigenfunctions of its molecular surface.
arXiv Detail & Related papers (2023-03-27T18:02:47Z) - Exploring Chemical Space with Score-based Out-of-distribution Generation [57.15855198512551]
We propose a score-based diffusion scheme that incorporates out-of-distribution control in the generative stochastic differential equation (SDE).
Since some novel molecules may not meet the basic requirements of real-world drugs, MOOD performs conditional generation by utilizing the gradients from a property predictor.
We experimentally validate that MOOD is able to explore the chemical space beyond the training distribution, generating molecules that outscore ones found with existing methods, and even the top 0.01% of the original training pool.
arXiv Detail & Related papers (2022-06-06T06:17:11Z) - An Extendible, Graph-Neural-Network-Based Approach for Accurate Force Field Development of Large Flexible Organic Molecules [4.456834955307613]
We develop an extendible ab initio force field for large flexible organic molecules at CW level of accuracy.
Tests on polyethylene glycol polymer chains show that our strategy is highly accurate and robust for molecules of different sizes.
arXiv Detail & Related papers (2021-06-02T04:12:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.