MLatom 3: Platform for machine learning-enhanced computational chemistry
simulations and workflows
- URL: http://arxiv.org/abs/2310.20155v1
- Date: Tue, 31 Oct 2023 03:41:39 GMT
- Title: MLatom 3: Platform for machine learning-enhanced computational chemistry
simulations and workflows
- Authors: Pavlo O. Dral, Fuchun Ge, Yi-Fan Hou, Peikun Zheng, Yuxinxin Chen,
Mario Barbatti, Olexandr Isayev, Cheng Wang, Bao-Xin Xue, Max Pinheiro Jr,
Yuming Su, Yiheng Dai, Yangtao Chen, Lina Zhang, Shuang Zhang, Arif Ullah,
Quanhao Zhang, Yanchi Ou
- Abstract summary: Machine learning (ML) is increasingly becoming a common tool in computational chemistry.
MLatom 3 is a program package designed to leverage the power of ML to enhance typical computational chemistry simulations.
The users can choose from an extensive library of methods containing pre-trained ML models and quantum mechanical approximations.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning (ML) is increasingly becoming a common tool in computational
chemistry. At the same time, the rapid development of ML methods requires a
flexible software framework for designing custom workflows. MLatom 3 is a
program package designed to leverage the power of ML to enhance typical
computational chemistry simulations and to create complex workflows. This
open-source package gives users plenty of choice: simulations can be run via
command-line options, input files, or scripts that use MLatom as a Python
package, either on their own computers or on the XACS cloud computing service
at XACScloud.com. Computational chemists can calculate energies
and thermochemical properties, optimize geometries, run molecular and quantum
dynamics, and simulate (ro)vibrational, one-photon UV/vis absorption, and
two-photon absorption spectra with ML, quantum mechanical, and combined models.
The users can choose from an extensive library of methods containing
pre-trained ML models and quantum mechanical approximations such as AIQM1
approaching coupled-cluster accuracy. The developers can build their own models
using various ML algorithms. The great flexibility of MLatom is largely due to
its extensive use of interfaces to many state-of-the-art software packages and
libraries.
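
To make the Python-package route concrete, below is a minimal sketch of a single-point AIQM1 calculation followed by a geometry optimization. It is an illustration only: the names used (ml.data.molecule.from_xyz_file, ml.models.methods, ml.optimize_geometry) follow the MLatom 3 tutorials as best recalled and should be checked against the current MLatom documentation; 'ethanol.xyz' is a hypothetical input file, and AIQM1 may require additional external programs to be installed.

```python
# Minimal sketch of MLatom used as a Python package, as described in the abstract.
# Names below (ml.data.molecule.from_xyz_file, ml.models.methods, ml.optimize_geometry)
# follow the MLatom 3 tutorials as best recalled and should be verified against the
# current documentation; 'ethanol.xyz' is a hypothetical input file.
import mlatom as ml

# Load a molecule from an XYZ file (hypothetical path).
mol = ml.data.molecule.from_xyz_file('ethanol.xyz')

# Pre-trained AIQM1 method (approaching coupled-cluster accuracy per the abstract).
# AIQM1 may need additional external programs to be installed.
aiqm1 = ml.models.methods(method='AIQM1')

# Single-point calculation: results are stored on the molecule object.
aiqm1.predict(molecule=mol, calculate_energy=True, calculate_energy_gradients=True)
print('AIQM1 energy (Hartree):', mol.energy)

# Geometry optimization driven by the same model.
opt = ml.optimize_geometry(model=aiqm1, initial_molecule=mol)
print('Optimized energy (Hartree):', opt.optimized_molecule.energy)
```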
Related papers
- DeeR-VLA: Dynamic Inference of Multimodal Large Language Models for Efficient Robot Execution [114.61347672265076]
Development of MLLMs for real-world robots is challenging due to the typically limited computation and memory capacities available on robotic platforms.
We propose a Dynamic Early-Exit Framework for Robotic Vision-Language-Action Model (DeeR) that automatically adjusts the size of the activated MLLM.
DeeR reduces the computational cost of the LLM by 5.2-6.5x and its GPU memory usage by 2-6x without compromising performance.
arXiv Detail & Related papers (2024-11-04T18:26:08Z)
- AQMLator -- An Auto Quantum Machine Learning E-Platform [0.0]
AQMLator aims to automatically propose and train the quantum layers of an ML model with minimal input from the user.
It uses standard ML libraries, making it easy to introduce into existing ML pipelines.
arXiv Detail & Related papers (2024-09-26T23:23:27Z)
- Quantum Extreme Learning of molecular potential energy surfaces and force fields [5.13730975608994]
A quantum neural network is used to learn the potential energy surface and force field of molecular systems.
This particular supervised learning routine allows for resource-efficient training, consisting of a simple linear regression performed on a classical computer.
We have tested a setup that can be used to study molecules of any dimension and is optimized for immediate use on NISQ devices.
Compared to other supervised learning routines, the proposed setup requires minimal quantum resources, making it feasible for direct implementation on quantum platforms.
arXiv Detail & Related papers (2024-06-20T18:00:01Z)
- MESS: Modern Electronic Structure Simulations [0.0]
Electronic structure simulation (ESS) has been used for decades to provide quantitative scientific insights on an atomistic scale.
Existing ESS codes are largely written in compiled languages such as FORTRAN and C, which limits their interoperability with the machine learning (ML) ecosystem.
We introduce MESS: a modern electronic structure simulation package implemented in JAX, porting the ESS code to the ML world.
arXiv Detail & Related papers (2024-06-05T10:15:16Z) - LLMC: Benchmarking Large Language Model Quantization with a Versatile Compression Toolkit [55.73370804397226]
Quantization, a key compression technique, can effectively mitigate the compute and memory demands of large language models by compressing and accelerating them.
We present LLMC, a plug-and-play compression toolkit, to fairly and systematically explore the impact of quantization.
Powered by this versatile toolkit, our benchmark covers three key aspects: calibration data, algorithms (three strategies), and data formats.
arXiv Detail & Related papers (2024-05-09T11:49:05Z)
- sQUlearn -- A Python Library for Quantum Machine Learning [0.0]
sQUlearn introduces a user-friendly, NISQ-ready Python library for quantum machine learning (QML).
The library's dual-layer architecture serves both QML researchers and practitioners.
arXiv Detail & Related papers (2023-11-15T14:22:53Z)
- In Situ Framework for Coupling Simulation and Machine Learning with
Application to CFD [51.04126395480625]
Recent years have seen many successful applications of machine learning (ML) to facilitate fluid dynamic computations.
As simulations grow, generating new training datasets for traditional offline learning creates I/O and storage bottlenecks.
This work offers a solution by simplifying this coupling and enabling in situ training and inference on heterogeneous clusters (a generic sketch of the in-situ idea is given after this list).
arXiv Detail & Related papers (2023-06-22T14:07:54Z)
- Deep learning applied to computational mechanics: A comprehensive
review, state of the art, and the classics [77.34726150561087]
Recent developments in artificial neural networks, particularly deep learning (DL), are reviewed in detail.
Both hybrid and pure machine learning (ML) methods are discussed.
The history and limitations of AI are recounted and discussed, with particular attention to pointing out misstatements or misconceptions of the classics.
arXiv Detail & Related papers (2022-12-18T02:03:00Z)
- Colmena: Scalable Machine-Learning-Based Steering of Ensemble
Simulations for High Performance Computing [3.5604179670745237]
We present Colmena, an open-source Python framework that allows users to steer campaigns by providing just the implementations of individual tasks.
Colmena handles task dispatch, results collation, ML model invocation, and ML model (re)training, using Parsl to execute tasks on HPC systems.
We describe the design of Colmena and illustrate its capabilities by applying it to electrolyte design, where it both scales to 65536 CPUs and accelerates the discovery rate for high-performance molecules by a factor of 100 over unguided searches.
arXiv Detail & Related papers (2021-10-06T14:56:53Z)
- QuaSiMo: A Composable Library to Program Hybrid Workflows for Quantum
Simulation [48.341084094844746]
We present a composable design scheme for the development of hybrid quantum/classical algorithms and for applications of quantum simulation.
We implement our design scheme using the hardware-agnostic programming language QCOR into the QuaSiMo library.
arXiv Detail & Related papers (2021-05-17T16:17:57Z)
- A backend-agnostic, quantum-classical framework for simulations of
chemistry in C++ [62.997667081978825]
We present the XACC system-level quantum computing framework as a platform for prototyping, developing, and deploying quantum-classical software.
A series of examples demonstrating some of the state-of-the-art chemistry algorithms currently implemented in XACC are presented.
arXiv Detail & Related papers (2021-05-04T16:53:51Z)
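
The In Situ Framework entry above describes coupling a running simulation with ML so that training happens in memory rather than through offline datasets written to disk. The sketch below is a generic, self-contained toy of that in-situ idea in plain NumPy; it is not the paper's framework or API, only an illustration of a surrogate model being trained incrementally from data produced inside the simulation loop (the simulation_step function and the linear surrogate are invented for illustration).

```python
# Generic toy illustration of in-situ coupling (not the paper's framework):
# the simulation and the surrogate model live in the same process, so
# training data never hits the filesystem, avoiding offline I/O bottlenecks.
import numpy as np

rng = np.random.default_rng(0)

def simulation_step(t: float) -> tuple[np.ndarray, float]:
    """Stand-in for one solver step: returns a feature vector and a target value."""
    x = rng.normal(size=3)
    y = 2.0 * x[0] - 0.5 * x[1] + 0.1 * np.sin(t) + x[2] ** 2
    return x, y

# Simple online surrogate: a linear model updated by SGD after every step.
w = np.zeros(4)   # weights including a bias term
lr = 0.05         # learning rate

for step in range(1000):
    x, y = simulation_step(t=0.01 * step)
    phi = np.append(x, 1.0)        # features plus bias
    pred = phi @ w                 # in-situ inference (e.g. to steer the solver)
    w -= lr * (pred - y) * phi     # in-situ training update, no files written

print("final surrogate weights:", w)
```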