An Artificial Intelligence (AI) workflow for catalyst design and
optimization
- URL: http://arxiv.org/abs/2402.04557v1
- Date: Wed, 7 Feb 2024 03:25:08 GMT
- Title: An Artificial Intelligence (AI) workflow for catalyst design and
optimization
- Authors: Nung Siong Lai, Yi Shen Tew, Xialin Zhong, Jun Yin, Jiali Li, Binhang
Yan, Xiaonan Wang
- Abstract summary: This study proposes an innovative Artificial Intelligence (AI) workflow that integrates Large Language Models (LLMs), Bayesian optimization, and an active learning loop.
Our methodology combines advanced language understanding with robust optimization strategies, effectively translating knowledge extracted from diverse literature into actionable parameters.
The results underscore the workflow's ability to streamline the catalyst development process, offering a swift, resource-efficient, and high-precision alternative to conventional methods.
- Score: 4.192356938537922
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In the pursuit of novel catalyst development to address pressing
environmental concerns and energy demand, conventional design and optimization
methods often fall short due to the complexity and vastness of the catalyst
parameter space. The advent of Machine Learning (ML) has ushered in a new era
in the field of catalyst optimization, offering potential solutions to the
shortcomings of traditional techniques. However, existing methods fail to
effectively harness the wealth of information contained within the burgeoning
body of scientific literature on catalyst synthesis. To address this gap, this
study proposes an innovative Artificial Intelligence (AI) workflow that
integrates Large Language Models (LLMs), Bayesian optimization, and an active
learning loop to expedite and enhance catalyst optimization. Our methodology
combines advanced language understanding with robust optimization strategies,
effectively translating knowledge extracted from diverse literature into
actionable parameters for practical experimentation and optimization. In this
article, we demonstrate the application of this AI workflow in the optimization
of catalyst synthesis for ammonia production. The results underscore the
workflow's ability to streamline the catalyst development process, offering a
swift, resource-efficient, and high-precision alternative to conventional
methods.
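
To make the described workflow concrete, the sketch below illustrates the kind of Bayesian-optimization active-learning loop the abstract refers to: a Gaussian-process surrogate is fit to the experiments run so far, an acquisition function (expected improvement here) proposes the next synthesis condition, and the measured result is fed back into the model. The specific synthesis parameters (calcination temperature, metal loading, reduction time), the run_experiment stand-in for a real ammonia-yield measurement, and the library choices are illustrative assumptions, not the authors' implementation; in the paper's workflow the initial conditions would come from LLM-extracted literature knowledge rather than random sampling.

```python
# Minimal sketch of a Bayesian-optimization active-learning loop for catalyst
# synthesis conditions. Parameter names and the objective are hypothetical.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical search space: [calcination T (degC), metal loading (wt%), reduction time (h)]
bounds = np.array([[300.0, 700.0], [1.0, 20.0], [0.5, 6.0]])

def run_experiment(x):
    """Placeholder for a real ammonia-yield measurement at condition x."""
    t, w, h = x
    return -((t - 520) / 200) ** 2 - ((w - 8) / 10) ** 2 - ((h - 3) / 3) ** 2

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    """Expected improvement acquisition function (maximization)."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Seed the loop with a few initial conditions (in the paper's workflow these
# would be derived from literature knowledge extracted by the LLM).
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 3))
y = np.array([run_experiment(x) for x in X])

for _ in range(20):  # active-learning iterations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    candidates = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 3))
    ei = expected_improvement(candidates, gp, y.max())
    x_next = candidates[np.argmax(ei)]   # condition proposed for the next experiment
    y_next = run_experiment(x_next)      # run the experiment and record the result
    X, y = np.vstack([X, x_next]), np.append(y, y_next)

print("Best conditions found:", X[np.argmax(y)], "with yield proxy:", y.max())
```

Replacing the synthetic objective with laboratory measurements turns the same loop into the experiment-in-the-loop optimizer the abstract describes.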
Related papers
- Optimization Strategies for Enhancing Resource Efficiency in Transformers & Large Language Models [0.0]
This study explores optimization techniques, including Quantization, Knowledge Distillation, and Pruning.
4-bit Quantization significantly reduces energy use with minimal accuracy loss.
Hybrid approaches, like NVIDIA's Minitron (which combines KD and structured pruning), further demonstrate promising trade-offs between size reduction and accuracy retention.
arXiv Detail & Related papers (2025-01-16T08:54:44Z)
- A Survey on Inference Optimization Techniques for Mixture of Experts Models [50.40325411764262]
Large-scale Mixture of Experts (MoE) models offer enhanced model capacity and computational efficiency through conditional computation.
However, deploying and running inference on these models presents significant challenges in computational resources, latency, and energy efficiency.
This survey analyzes optimization techniques for MoE models across the entire system stack.
arXiv Detail & Related papers (2024-12-18T14:11:15Z)
- Synergistic Development of Perovskite Memristors and Algorithms for Robust Analog Computing [53.77822620185878]
We propose a synergistic methodology to concurrently optimize perovskite memristor fabrication and develop robust analog DNNs.
We develop "BayesMulti", a training strategy utilizing BO-guided noise injection to improve the resistance of analog DNNs to memristor imperfections.
Our integrated approach enables the use of analog computing in much deeper and wider networks, achieving up to 100-fold improvements.
arXiv Detail & Related papers (2024-12-03T19:20:08Z)
- Large Language Model as a Catalyst: A Paradigm Shift in Base Station Siting Optimization [62.16747639440893]
Large language models (LLMs) and their associated technologies continue to advance, particularly in the realms of prompt engineering and agent engineering.
Our proposed framework incorporates retrieval-augmented generation (RAG) to enhance the system's ability to acquire domain-specific knowledge and generate solutions.
arXiv Detail & Related papers (2024-08-07T08:43:32Z)
- Inference Optimization of Foundation Models on AI Accelerators [68.24450520773688]
Powerful foundation models, including large language models (LLMs), with Transformer architectures have ushered in a new era of Generative AI.
As the number of model parameters reaches hundreds of billions, their deployment incurs prohibitive inference costs and high latency in real-world scenarios.
This tutorial offers a comprehensive discussion on complementary inference optimization techniques using AI accelerators.
arXiv Detail & Related papers (2024-07-12T09:24:34Z)
- A Machine Learning and Explainable AI Framework Tailored for Unbalanced Experimental Catalyst Discovery [10.92613600218535]
We introduce a robust machine learning and explainable AI (XAI) framework to accurately classify the catalytic yield of various compositions.
This framework combines a series of ML practices designed to handle the scarcity and imbalance of catalyst data.
We believe that such insights can assist chemists in the development and identification of novel catalysts with superior performance.
arXiv Detail & Related papers (2024-07-10T13:09:53Z)
- Leveraging Data Mining, Active Learning, and Domain Adaptation in a Multi-Stage, Machine Learning-Driven Approach for the Efficient Discovery of Advanced Acidic Oxygen Evolution Electrocatalysts [10.839705761909709]
This study introduces a novel, multi-stage machine learning (ML) approach to streamline the discovery and optimization of complex multi-metallic catalysts.
Our method integrates data mining, active learning, and domain adaptation throughout the materials discovery process.
arXiv Detail & Related papers (2024-07-05T22:14:55Z)
- Adaptive Catalyst Discovery Using Multicriteria Bayesian Optimization with Representation Learning [17.00084254889438]
High-performance catalysts are crucial for sustainable energy conversion and human health.
The discovery of catalysts faces challenges due to the absence of efficient approaches to navigating vast and high-dimensional structure and composition spaces.
arXiv Detail & Related papers (2024-04-18T18:11:06Z)
- Machine Learning Insides OptVerse AI Solver: Design Principles and Applications [74.67495900436728]
We present a comprehensive study on the integration of machine learning (ML) techniques into Huawei Cloud's OptVerse AI solver.
We showcase our methods for generating complex SAT and MILP instances utilizing generative models that mirror the multifaceted structures of real-world problems.
We detail the incorporation of state-of-the-art parameter tuning algorithms which markedly elevate solver performance.
arXiv Detail & Related papers (2024-01-11T15:02:15Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z)