Machine Learning Insides OptVerse AI Solver: Design Principles and
Applications
- URL: http://arxiv.org/abs/2401.05960v2
- Date: Wed, 17 Jan 2024 13:26:09 GMT
- Title: Machine Learning Insides OptVerse AI Solver: Design Principles and
Applications
- Authors: Xijun Li, Fangzhou Zhu, Hui-Ling Zhen, Weilin Luo, Meng Lu, Yimin
Huang, Zhenan Fan, Zirui Zhou, Yufei Kuang, Zhihai Wang, Zijie Geng, Yang Li,
Haoyang Liu, Zhiwu An, Muming Yang, Jianshu Li, Jie Wang, Junchi Yan, Defeng
Sun, Tao Zhong, Yong Zhang, Jia Zeng, Mingxuan Yuan, Jianye Hao, Jun Yao, Kun
Mao
- Abstract summary: We present a comprehensive study on the integration of machine learning (ML) techniques into Huawei Cloud's OptVerse AI solver.
We showcase our methods for generating complex SAT and MILP instances utilizing generative models that mirror the multifaceted structures of real-world problems.
We detail the incorporation of state-of-the-art parameter tuning algorithms which markedly elevate solver performance.
- Score: 74.67495900436728
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In an era of digital ubiquity, efficient resource management and
decision-making are paramount across numerous industries. To this end, we
present a comprehensive study on the integration of machine learning (ML)
techniques into Huawei Cloud's OptVerse AI Solver, which aims to mitigate the
scarcity of real-world mathematical programming instances, and to surpass the
capabilities of traditional optimization techniques. We showcase our methods
for generating complex SAT and MILP instances utilizing generative models that
mirror the multifaceted structures of real-world problems. Furthermore, we introduce
a training framework leveraging augmentation policies to maintain solvers'
utility in dynamic environments. Besides the data generation and augmentation,
our proposed approaches also include novel ML-driven policies for personalized
solver strategies, with an emphasis on applications like graph convolutional
networks for initial basis selection and reinforcement learning for advanced
presolving and cut selection. Additionally, we detail the incorporation of
state-of-the-art parameter tuning algorithms which markedly elevate solver
performance. Compared with traditional solvers such as CPLEX and SCIP, our
ML-augmented OptVerse AI Solver demonstrates superior speed and precision
across both established benchmarks and real-world scenarios, reinforcing the
practical imperative and effectiveness of machine learning techniques in
mathematical programming solvers.
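As an illustration of the GCN-based initial basis selection mentioned above, the sketch below encodes an LP as the standard bipartite constraint-variable graph and scores each variable's simplex basis status. It is a minimal PyTorch sketch under assumed inputs, not the OptVerse implementation: the class names (BipartiteGCNLayer, InitialBasisGCN), the hand-picked constraint and variable features, the layer sizes, and the three-way status head are all illustrative assumptions.

```python
# Minimal sketch (not the OptVerse implementation) of a graph convolutional
# network for initial basis selection. The LP  min c^T x  s.t.  Ax <= b,
# l <= x <= u  is encoded as a bipartite graph: one node per constraint,
# one node per variable, with the nonzero pattern of A as the adjacency.
# All feature choices and layer sizes are illustrative.
import torch
import torch.nn as nn


class BipartiteGCNLayer(nn.Module):
    """One round of constraint <-> variable message passing."""

    def __init__(self, dim: int):
        super().__init__()
        self.var_update = nn.Linear(2 * dim, dim)
        self.con_update = nn.Linear(2 * dim, dim)

    def forward(self, A, h_con, h_var):
        # A: (m, n) constraint matrix used as the bipartite adjacency.
        m_to_var = A.t() @ h_con              # aggregate constraint messages per variable
        h_var = torch.relu(self.var_update(torch.cat([h_var, m_to_var], dim=-1)))
        m_to_con = A @ h_var                  # aggregate variable messages per constraint
        h_con = torch.relu(self.con_update(torch.cat([h_con, m_to_con], dim=-1)))
        return h_con, h_var


class InitialBasisGCN(nn.Module):
    """Predicts, for each variable, logits over simplex basis statuses
    (basic / nonbasic at lower bound / nonbasic at upper bound)."""

    def __init__(self, con_feats: int, var_feats: int, dim: int = 64, layers: int = 2):
        super().__init__()
        self.embed_con = nn.Linear(con_feats, dim)
        self.embed_var = nn.Linear(var_feats, dim)
        self.gcn = nn.ModuleList([BipartiteGCNLayer(dim) for _ in range(layers)])
        self.head = nn.Linear(dim, 3)

    def forward(self, A, con_features, var_features):
        h_con = torch.relu(self.embed_con(con_features))
        h_var = torch.relu(self.embed_var(var_features))
        for layer in self.gcn:
            h_con, h_var = layer(A, h_con, h_var)
        return self.head(h_var)              # (n, 3) logits per variable


if __name__ == "__main__":
    m, n = 8, 12                              # toy LP: 8 constraints, 12 variables
    A = (torch.rand(m, n) < 0.3).float()      # random sparsity pattern as adjacency
    con_feat = torch.randn(m, 2)              # e.g. [rhs, row norm]
    var_feat = torch.randn(n, 3)              # e.g. [cost, lower bound, upper bound]
    logits = InitialBasisGCN(con_feats=2, var_feats=3)(A, con_feat, var_feat)
    print(logits.argmax(dim=-1))              # predicted basis status per variable
```

In practice the per-variable predictions would still need to be repaired into a valid basis (exactly m basic columns for m rows) before being handed to the simplex method as a warm start.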
Related papers
- Beyond Linear Approximations: A Novel Pruning Approach for Attention Matrix [17.086679273053853]
Large Language Models (LLMs) have shown immense potential in enhancing various aspects of our daily lives.
Their growing capabilities come at the cost of extremely large model sizes, making deployment on edge devices challenging.
This paper introduces a novel approach to LLM weight pruning that directly optimizes for approximating the attention matrix.
arXiv Detail & Related papers (2024-10-15T04:35:56Z)
- When Large Language Model Meets Optimization [7.822833805991351]
Large language models (LLMs) facilitate intelligent modeling and strategic decision-making in optimization.
This review outlines the progress and potential of combining LLMs with optimization algorithms.
arXiv Detail & Related papers (2024-05-16T13:54:37Z)
- Machine Learning Augmented Branch and Bound for Mixed Integer Linear Programming [11.293025183996832]
Mixed Integer Linear Programming (MILP) offers a powerful modeling language for a wide range of applications.
In recent years, there has been an explosive development in the use of machine learning algorithms for enhancing all main tasks involved in the branch-and-bound algorithm.
In particular, we give detailed attention to machine learning algorithms that automatically optimize some metric of branch-and-bound efficiency (see the branching-policy sketch after this list).
arXiv Detail & Related papers (2024-02-08T09:19:26Z)
- Towards Efficient Generative Large Language Model Serving: A Survey from Algorithms to Systems [14.355768064425598]
Generative large language models (LLMs) stand at the forefront, revolutionizing how we interact with our data.
However, the computational intensity and memory consumption of deploying these models present substantial challenges in terms of serving efficiency.
This survey addresses the imperative need for efficient LLM serving methodologies from a machine learning system (MLSys) research perspective.
arXiv Detail & Related papers (2023-12-23T11:57:53Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Reconfigurable Intelligent Surface Assisted Mobile Edge Computing with Heterogeneous Learning Tasks [53.1636151439562]
Mobile edge computing (MEC) provides a natural platform for AI applications.
We present an infrastructure to perform machine learning tasks at an MEC with the assistance of a reconfigurable intelligent surface (RIS).
Specifically, we minimize the learning error of all participating users by jointly optimizing transmit power of mobile users, beamforming vectors of the base station, and the phase-shift matrix of the RIS.
arXiv Detail & Related papers (2020-12-25T07:08:50Z)
- AI-based Modeling and Data-driven Evaluation for Smart Manufacturing Processes [56.65379135797867]
We propose a dynamic algorithm for gaining useful insights about semiconductor manufacturing processes.
We elaborate on the utilization of a Genetic Algorithm and Neural Network to propose an intelligent feature selection algorithm.
arXiv Detail & Related papers (2020-08-29T14:57:53Z)
- Optimization-driven Machine Learning for Intelligent Reflecting Surfaces Assisted Wireless Networks [82.33619654835348]
Intelligent reflecting surface (IRS) technology has been employed to reshape wireless channels by controlling the phase shifts of individual scattering elements.
Due to the large number of scattering elements, passive beamforming is typically challenged by high computational complexity.
In this article, we focus on machine learning (ML) approaches for improving performance in IRS-assisted wireless networks.
arXiv Detail & Related papers (2020-08-29T08:39:43Z)
- A Survey on Large-scale Machine Learning [67.6997613600942]
Machine learning can provide deep insights into data, allowing machines to make high-quality predictions.
Most sophisticated machine learning approaches suffer from huge time costs when operating on large-scale data.
Large-scale machine learning aims to learn patterns from big data efficiently while achieving comparable performance.
arXiv Detail & Related papers (2020-08-10T06:07:52Z)
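As referenced in the branch-and-bound entry above, the following is a minimal, hypothetical sketch of a learned branching policy: a small MLP (BranchingScorer, an illustrative name) scores the fractional candidate variables at a node, for example after imitation training against strong branching, with most-fractional branching as a fallback when no trained model is available. The four-dimensional candidate feature vector and the model size are assumptions made for illustration, not any particular paper's architecture.

```python
# Minimal sketch of a learned branching policy of the kind surveyed above
# (illustrative, not tied to any specific solver): an MLP scores candidate
# fractional variables at a branch-and-bound node.
import torch
import torch.nn as nn


class BranchingScorer(nn.Module):
    def __init__(self, n_features: int = 4, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, candidate_features):
        # candidate_features: (k, n_features) for k fractional candidates
        return self.net(candidate_features).squeeze(-1)   # (k,) scores


def select_branching_variable(scorer, lp_values, candidates, features):
    """Pick the candidate with the highest learned score; fall back to
    most-fractional branching if no trained scorer is available."""
    if scorer is None:
        frac = torch.abs(lp_values[candidates] - torch.round(lp_values[candidates]))
        return candidates[int(torch.argmax(frac))]
    with torch.no_grad():
        scores = scorer(features)
    return candidates[int(torch.argmax(scores))]


if __name__ == "__main__":
    lp_values = torch.tensor([0.0, 2.4, 1.0, 0.7, 3.5])   # toy LP relaxation solution
    candidates = torch.tensor([1, 3, 4])                   # fractional integer variables
    # Illustrative per-candidate features, e.g. [fractionality, cost, up/down pseudocosts]
    features = torch.randn(len(candidates), 4)
    var = select_branching_variable(BranchingScorer(), lp_values, candidates, features)
    print(f"branch on variable {int(var)}")
```

A real integration would extract the candidate features from the solver's LP relaxation at each node rather than from synthetic tensors.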