Gumbel-softmax-based Optimization: A Simple General Framework for
Optimization Problems on Graphs
- URL: http://arxiv.org/abs/2004.07300v1
- Date: Tue, 14 Apr 2020 14:11:00 GMT
- Title: Gumbel-softmax-based Optimization: A Simple General Framework for
Optimization Problems on Graphs
- Authors: Yaoxin Li, Jing Liu, Guozheng Lin, Yueyuan Hou, Muyun Mou and Jiang
Zhang
- Abstract summary: We propose a simple, fast, and general algorithm framework based on the advanced automatic differentiation techniques provided by deep learning frameworks.
High-quality solutions can be obtained in much less time than with traditional approaches.
- Score: 5.486093983007419
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In computer science, there exist a large number of optimization problems
defined on graphs; that is, finding an optimal node-state configuration or a
network structure such that a designed objective function is optimized under
some constraints. However, these problems are notoriously hard to solve because
most of them are NP-hard or NP-complete. Although traditional general methods
such as simulated annealing (SA) and genetic algorithms (GA) have been applied
to these hard problems, their accuracy and running time are often
unsatisfactory in practice. In this work, we propose a simple, fast, and
general algorithm framework based on the advanced automatic differentiation
techniques provided by deep learning frameworks. By introducing the
Gumbel-softmax technique, we can optimize the objective function directly by
gradient descent regardless of the discrete nature of the variables. We also
introduce an evolution strategy into a parallelized version of our algorithm.
We test our algorithm on representative optimization problems on graphs:
modularity optimization from network science, the Sherrington-Kirkpatrick (SK)
model from statistical physics, and the maximum independent set (MIS) and
minimum vertex cover (MVC) problems from combinatorial optimization.
High-quality solutions can be obtained in much less time than with traditional
approaches.
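As a minimal sketch of the idea (not the authors' released code; PyTorch, the penalty weight, and all hyperparameters below are assumptions for illustration), the MIS case can be relaxed by giving each node a logit pair over the binary states {out, in} and drawing differentiable samples softmax((logits + g) / tau) with g ~ Gumbel(0, 1), so the penalized objective is minimized by plain gradient descent:

```python
# Hedged sketch of Gumbel-softmax-based optimization for maximum
# independent set (MIS); hyperparameters are illustrative, not tuned.
import torch
import torch.nn.functional as F

def mis_gumbel_softmax(edge_index, n_nodes, steps=2000, tau=1.0,
                       penalty=2.0, lr=0.1):
    # edge_index: LongTensor of shape (2, n_edges) listing the edges.
    # One logit pair per node over the binary states {out, in}.
    logits = torch.zeros(n_nodes, 2, requires_grad=True)
    opt = torch.optim.Adam([logits], lr=lr)
    src, dst = edge_index
    for _ in range(steps):
        # Differentiable relaxed sample of all node states at once.
        x = F.gumbel_softmax(logits, tau=tau, hard=False)[:, 1]
        # Penalized objective: reward selected nodes, penalize edges
        # whose two endpoints are both (softly) selected.
        loss = -(x.sum() - penalty * (x[src] * x[dst]).sum())
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Hard decode: a node is in the set if its "in" logit wins.
    return logits.argmax(dim=1)
```

Because the penalty is soft, the decoded configuration can still violate independence and would typically be repaired greedily. The parallelized, evolution-strategy variant mentioned in the abstract would amount to carrying a batch of independent logit copies and keeping the best decoded solution, and modularity or the SK energy slot into the same loop by swapping the loss.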
Related papers
- Accelerated Gradient Algorithms with Adaptive Subspace Search for
Instance-Faster Optimization [6.896308995955336]
Gradient-based minimax optimal algorithms have promoted the development of continuous optimization and machine learning.
In this paper, we open up a new way to design and analyze gradient-based algorithms with direct applications in machine learning.
arXiv Detail & Related papers (2023-12-06T01:16:10Z)
- Are Graph Neural Networks Optimal Approximation Algorithms? [26.5364112420121]
We design graph neural network architectures, termed OptGNN, that capture optimal approximation algorithms for a class of combinatorial optimization problems.
We take advantage of OptGNN's ability to capture convex relaxations to design an algorithm for producing bounds on the optimal solution from the learned embeddings of OptGNN.
arXiv Detail & Related papers (2023-10-01T00:12:31Z)
- Accelerating Cutting-Plane Algorithms via Reinforcement Learning Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z)
- An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm for First-order and Zeroth-order Optimization [111.24899593052851]
Conditional gradient algorithm (also known as the Frank-Wolfe algorithm) has recently regained popularity in the machine learning community.
ARCS is the first zeroth-order conditional gradient sliding type algorithm for solving convex problems in zeroth-order optimization.
In first-order optimization, the convergence results of ARCS substantially outperform previous algorithms in terms of the number of gradient oracle queries.
arXiv Detail & Related papers (2021-09-18T07:08:11Z)
- Combinatorial Optimization with Physics-Inspired Graph Neural Networks [0.0]
We show how graph neural networks can be used to solve combinatorial optimization problems.
We find that the neural network performs on par with or outperforms existing solvers.
arXiv Detail & Related papers (2021-07-02T16:54:35Z)
- A Bi-Level Framework for Learning to Solve Combinatorial Optimization on Graphs [91.07247251502564]
We propose a hybrid approach to combine the best of both worlds, in which a bi-level framework is developed with an upper-level learning method to optimize the graph.
Such a bi-level approach simplifies the learning on the original hard CO and can effectively mitigate the demand for model capacity.
arXiv Detail & Related papers (2021-06-09T09:18:18Z)
- Minimax Optimization with Smooth Algorithmic Adversaries [59.47122537182611]
We propose a new algorithm for the min-player against smooth algorithms deployed by an adversary.
Our algorithm is guaranteed to make monotonic progress (thus having no limit cycles) and to find an appropriate stationary point in a polynomial number of iterations.
arXiv Detail & Related papers (2021-06-02T22:03:36Z)
- Towards Optimally Efficient Tree Search with Deep Learning [76.64632985696237]
This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models.
The problem is NP-hard and often arises in diverse applications such as signal processing, bioinformatics, communications and machine learning.
We propose a general hyper-accelerated tree search (HATS) algorithm that employs a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm.
arXiv Detail & Related papers (2021-01-07T08:00:02Z)
- Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other data problems, we give an overview of recent theoretical results on non-convex optimization algorithms.
arXiv Detail & Related papers (2020-12-11T08:28:51Z)
- Improving the Quantum Approximate Optimization Algorithm with postselection [0.0]
Combinatorial optimization is among the main applications envisioned for near-term and fault-tolerant quantum computers.
We consider a well-studied quantum algorithm for optimization: the Quantum Approximate Optimization Algorithm (QAOA) applied to the MaxCut problem on 3-regular graphs.
We derive theoretical upper and lower bounds showing that a constant (though small) increase of the fraction of satisfied edges is indeed achievable.
arXiv Detail & Related papers (2020-11-10T22:17:50Z)
- Transferable Graph Optimizers for ML Compilers [18.353830282858834]
We propose an end-to-end, transferable deep reinforcement learning method for computational graph optimization (GO).
GO generates decisions on the entire graph rather than on each individual node autoregressively, drastically speeding up the search compared to prior methods.
GO achieves 21% improvement over human experts and 18% improvement over the prior state of the art with 15x faster convergence.
arXiv Detail & Related papers (2020-10-21T20:28:33Z)