Symmetry-Aware Generative Modeling through Learned Canonicalization
- URL: http://arxiv.org/abs/2501.07773v2
- Date: Mon, 03 Feb 2025 16:33:20 GMT
- Title: Symmetry-Aware Generative Modeling through Learned Canonicalization
- Authors: Kusha Sareen, Daniel Levy, Arnab Kumar Mondal, Sékou-Oumar Kaba, Tara Akhound-Sadegh, Siamak Ravanbakhsh
- Abstract summary: Generative modeling of symmetric densities has a range of applications in AI for science, from drug discovery to physics simulations.
We propose to model a learned slice of the density so that only one representative element per orbit is learned.
Preliminary experimental results on molecular modeling are promising, demonstrating improved sample quality and faster inference time.
- Abstract: Generative modeling of symmetric densities has a range of applications in AI for science, from drug discovery to physics simulations. The existing generative modeling paradigm for invariant densities combines an invariant prior with an equivariant generative process. However, we observe that this technique is not necessary and has several drawbacks resulting from the limitations of equivariant networks. Instead, we propose to model a learned slice of the density so that only one representative element per orbit is learned. To accomplish this, we learn a group-equivariant canonicalization network that maps training samples to a canonical pose and train a non-equivariant generative model over these canonicalized samples. We implement this idea in the context of diffusion models. Our preliminary experimental results on molecular modeling are promising, demonstrating improved sample quality and faster inference time.
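As a minimal sketch of the slice idea, the following hand-crafted PCA canonicalization for 3D point clouds stands in for the paper's *learned* equivariant canonicalization network (an illustrative assumption, not the authors' method). Every rigid-motion copy of a sample is mapped to one canonical pose, so a downstream non-equivariant generative model only ever sees one representative per orbit. Note that PCA with third-moment sign fixing canonicalizes up to O(3) (rotations and reflections), not strictly SO(3):

```python
import numpy as np

def canonicalize(points):
    """Map an (N, 3) point cloud to a canonical pose.

    Hand-crafted stand-in for a learned canonicalization network:
    center to remove translation, then rotate into the PCA frame
    so the principal axes align with the coordinate axes.
    """
    centered = points - points.mean(axis=0)            # remove translation
    _, eigvecs = np.linalg.eigh(centered.T @ centered) # ascending eigenvalues
    frame = eigvecs[:, ::-1]                           # largest-variance axis first
    proj = centered @ frame
    # Break each axis's eigenvector sign ambiguity with the third moment,
    # so all poses in an orbit map to the same representative.
    signs = np.sign((proj ** 3).sum(axis=0))
    signs[signs == 0] = 1.0
    return proj * signs

# Any rotated + translated copy of a cloud canonicalizes to the same pose:
rng = np.random.default_rng(0)
x = rng.normal(size=(32, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
x_rot = x @ Rz.T + 1.5                                 # apply a group action
print(bool(np.allclose(canonicalize(x), canonicalize(x_rot), atol=1e-6)))
```

Because the canonicalization collapses each orbit to a single pose, the generative model needs no equivariant layers; at sampling time a random group element can be applied to a generated canonical sample to recover the full invariant density.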
Related papers
- Accelerated Diffusion Models via Speculative Sampling [89.43940130493233]
Speculative sampling is a popular technique for accelerating inference in Large Language Models.
We extend speculative sampling to diffusion models, which generate samples via continuous, vector-valued Markov chains.
We propose various drafting strategies, including a simple and effective approach that does not require training a draft model.
arXiv Detail & Related papers (2025-01-09T16:50:16Z)
- Deconstructing equivariant representations in molecular systems [6.841858294458366]
We report on experiments using a simple equivariant graph convolution model on the QM9 dataset.
Our key finding is that, for a scalar prediction task, many of the irreducible representations are simply ignored during training.
We empirically show that removing some unused orders of spherical harmonics improves model performance.
arXiv Detail & Related papers (2024-10-10T17:15:46Z)
- Diffeomorphic Measure Matching with Kernels for Generative Modeling [1.2058600649065618]
This article presents a framework for transporting probability measures toward minimum-divergence generative modeling and sampling using ordinary differential equations (ODEs) and reproducing kernel Hilbert spaces (RKHSs).
A theoretical analysis of the proposed method is presented, giving a priori error bounds in terms of the complexity of the model, the number of samples in the training set, and model misspecification.
arXiv Detail & Related papers (2024-02-12T21:44:20Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Reduce, Reuse, Recycle: Compositional Generation with Energy-Based Diffusion Models and MCMC [102.64648158034568]
Diffusion models have quickly become the prevailing approach to generative modeling in many domains.
We propose an energy-based parameterization of diffusion models which enables the use of new compositional operators.
We find these samplers lead to notable improvements in compositional generation across a wide set of problems.
arXiv Detail & Related papers (2023-02-22T18:48:46Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce the Equivariant Stein Variational Gradient Descent algorithm, an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We also propose new ways of improving and scaling up the training of energy-based models.
arXiv Detail & Related papers (2021-06-15T01:35:17Z)
- Physical invariance in neural networks for subgrid-scale scalar flux modeling [5.333802479607541]
We present a new strategy for modeling the subgrid-scale scalar flux in a three-dimensional turbulent incompressible flow using physics-informed neural networks (NNs).
We show that the proposed transformation-invariant NN model outperforms both purely data-driven ones and parametric state-of-the-art subgrid-scale models.
arXiv Detail & Related papers (2020-10-09T16:09:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.