Pushing the limits of unconstrained machine-learned interatomic potentials
- URL: http://arxiv.org/abs/2601.16195v1
- Date: Thu, 22 Jan 2026 18:46:58 GMT
- Title: Pushing the limits of unconstrained machine-learned interatomic potentials
- Authors: Filippo Bigi, Paolo Pegolo, Arslan Mazitov, Michele Ceriotti
- Abstract summary: Machine-learned interatomic potentials (MLIPs) are increasingly used to replace computationally demanding electronic-structure calculations. We show that unconstrained models can be superior in accuracy and speed when compared to physically constrained models.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine-learned interatomic potentials (MLIPs) are increasingly used to replace computationally demanding electronic-structure calculations to model matter at the atomic scale. The most commonly used model architectures are constrained to fulfill a number of physical laws exactly, from geometric symmetries to energy conservation. Evidence is mounting that relaxing some of these constraints can be beneficial to the efficiency and (somewhat surprisingly) accuracy of MLIPs, even though care should be taken to avoid qualitative failures associated with the breaking of physical symmetries. Given the recent trend of \emph{scaling up} models to larger numbers of parameters and training samples, a very important question is how unconstrained MLIPs behave in this limit. Here we investigate this issue, showing that -- when trained on large datasets -- unconstrained models can be superior in accuracy and speed when compared to physically constrained models. We assess these models both in terms of benchmark accuracy and in terms of usability in practical scenarios, focusing on static simulation workflows such as geometry optimization and lattice dynamics. We conclude that accurate unconstrained models can be applied with confidence, especially since simple inference-time modifications can be used to recover observables that are consistent with the relevant physical symmetries.
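The abstract's closing point, that simple inference-time modifications can recover symmetry-consistent observables, can be illustrated with a minimal sketch: averaging an unconstrained model's energy over random rotations of the input yields a rotationally invariant estimate. The `toy_model` below is a hypothetical stand-in for a trained MLIP, not the paper's actual architecture.

```python
import numpy as np

def random_rotation(rng):
    """Draw a proper rotation (det = +1), approximately uniform on SO(3)."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))          # fix column signs for uniformity
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1                 # flip one axis to land in SO(3)
    return q

def symmetrized_energy(model, positions, n_rot=128, seed=0):
    """Inference-time symmetrization: average the model over random rotations."""
    rng = np.random.default_rng(seed)
    return np.mean([model(positions @ random_rotation(rng).T)
                    for _ in range(n_rot)])

# Hypothetical "unconstrained" model: an invariant part (sum of pairwise
# distances) plus a small term that explicitly breaks rotational symmetry.
def toy_model(pos):
    dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    return dists.sum() + 0.1 * pos[:, 0].sum()
```

Averaging drives the expectation of the symmetry-breaking term to zero, at the cost of extra inference passes; the same trick applies to forces if each rotated prediction is rotated back to the original frame.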
Related papers
- Equivariant Evidential Deep Learning for Interatomic Potentials
Uncertainty quantification (UQ) is critical for assessing the reliability of machine learning interatomic potentials (MLIPs) in molecular dynamics simulations. Existing UQ approaches for MLIPs are often limited by high computational cost or suboptimal performance. We propose Equivariant Evidential Deep Learning for Interatomic Potentials (e2IP), a backbone-agnostic framework that models atomic forces and their uncertainty jointly.
arXiv Detail & Related papers (2026-02-11T02:00:25Z)
- From Evaluation to Design: Using Potential Energy Surface Smoothness Metrics to Guide Machine Learning Interatomic Potential Architectures
MLIPs fail to reproduce the physical smoothness of the quantum potential energy surface. Existing evaluations, such as microcanonical molecular dynamics, are computationally expensive and primarily probe near-equilibrium states. We introduce the Bond Smoothness Characterization Test (BSCT) to improve evaluation metrics for MLIPs.
arXiv Detail & Related papers (2026-02-04T18:50:10Z)
- Conditional Denoising Model as a Physical Surrogate Model
We introduce a generative model designed to learn the geometry of the physical manifold itself. By training the network to restore clean states from noisy ones, the model learns a vector field that points continuously towards the valid solution subspace.
arXiv Detail & Related papers (2026-01-28T20:32:20Z)
- SEAL - A Symmetry EncourAging Loss for High Energy Physics
Building machine learning models that explicitly respect symmetries can be difficult due to the dedicated components required. We introduce soft constraints that allow the model to decide the importance of added symmetries during the learning process instead of enforcing exact symmetries.
arXiv Detail & Related papers (2025-11-03T19:00:13Z)
- Reframing Generative Models for Physical Systems using Stochastic Interpolants
Generative models have emerged as powerful surrogates for physical systems, demonstrating increased accuracy, stability, and/or statistical fidelity. Most approaches rely on iteratively denoising a Gaussian, a choice that may not be the most effective for autoregressive prediction tasks in PDEs and dynamical systems such as climate. In this work, we benchmark generative models across diverse physical domains and tasks, and highlight the role of interpolants.
arXiv Detail & Related papers (2025-09-30T14:02:00Z)
- Training-Free Constrained Generation With Stable Diffusion Models
Stable diffusion models represent the state-of-the-art in data synthesis across diverse domains. Existing techniques are either limited in their applicability to latent diffusion frameworks or lack the capability to strictly enforce domain-specific constraints. This paper proposes a novel integration of stable diffusion models with constrained optimization frameworks, enabling the generation of outputs satisfying stringent physical and functional requirements.
arXiv Detail & Related papers (2025-02-08T16:11:17Z)
- Probing the effects of broken symmetries in machine learning
We show that non-symmetric models can learn symmetries from data, and that doing so can even be beneficial for the accuracy of the model.
We focus specifically on physical observables that are likely to be affected -- directly or indirectly -- by symmetry breaking, finding negligible consequences when the model is used in an interpolative, bulk, regime.
arXiv Detail & Related papers (2024-06-25T17:34:09Z)
- Scaling and renormalization in high-dimensional regression
We present a unifying perspective on recent results on ridge regression. We use the basic tools of random matrix theory and free probability, aimed at readers with backgrounds in physics and deep learning. Our results extend and provide a unifying perspective on earlier models of scaling laws.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Similarity Equivariant Graph Neural Networks for Homogenization of Metamaterials
Soft, porous mechanical metamaterials exhibit pattern transformations that may have important applications in soft robotics, sound reduction and biomedicine. We develop a machine learning-based approach that scales favorably to serve as a surrogate model. We show that this network is more accurate and data-efficient than graph neural networks with fewer symmetries.
arXiv Detail & Related papers (2024-04-26T12:30:32Z)
- Quantum-informed simulations for mechanics of materials: DFTB+MBD framework
We study how quantum effects can modify the mechanical properties of systems relevant to materials engineering.
We provide an open-source repository containing all codes, datasets, and examples presented in this work.
arXiv Detail & Related papers (2024-04-05T16:59:01Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- A data-driven peridynamic continuum model for upscaling molecular dynamics
We propose a learning framework to extract, from molecular dynamics data, an optimal Linear Peridynamic Solid model.
We provide sufficient well-posedness conditions for discretized LPS models with sign-changing influence functions.
This framework guarantees that the resulting model is mathematically well-posed, physically consistent, and that it generalizes well to settings that are different from the ones used during training.
arXiv Detail & Related papers (2021-08-04T07:07:47Z)
- Uncertainty estimation for molecular dynamics and sampling
Machine learning models have emerged as a very effective strategy to sidestep time-consuming electronic-structure calculations.
It is crucial to obtain an estimate of the error that derives from the finite number of reference structures included during the training of the model.
We present examples covering different types of structural and thermodynamic properties, and systems as diverse as water and liquid gallium.
arXiv Detail & Related papers (2020-11-10T00:07:50Z)
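The committee idea behind this last entry can be illustrated with a minimal sketch (a hypothetical ensemble, not the paper's actual models): the ensemble mean serves as the prediction, and the spread across members estimates the error that derives from the finite set of reference structures.

```python
import numpy as np

def committee_estimate(member_predictions):
    """Prediction and uncertainty from a committee of models.

    `member_predictions` has shape (n_members, ...); the committee mean is the
    estimate and the sample standard deviation across members is the error bar.
    """
    p = np.asarray(member_predictions, dtype=float)
    return p.mean(axis=0), p.std(axis=0, ddof=1)

# Hypothetical committee of 4 models predicting energies for 2 structures
preds = np.array([[1.0, -3.2],
                  [1.2, -3.0],
                  [0.8, -3.1],
                  [1.0, -3.3]])
mean, sigma = committee_estimate(preds)
```

In practice the raw spread is often rescaled against a validation set to give calibrated error bars before being used to drive sampling or active learning.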
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.