Symmetric Tensor Networks for Generative Modeling and Constrained
Combinatorial Optimization
- URL: http://arxiv.org/abs/2211.09121v3
- Date: Sat, 8 Jul 2023 11:11:36 GMT
- Title: Symmetric Tensor Networks for Generative Modeling and Constrained
Combinatorial Optimization
- Authors: Javier Lopez-Piqueres, Jing Chen, Alejandro Perdomo-Ortiz
- Abstract summary: Constrained optimization problems abound in industry, from portfolio optimization to logistics.
One of the major roadblocks in solving these problems is the presence of non-trivial hard constraints which limit the valid search space.
In this work, we encode arbitrary integer-valued equality constraints of the form Ax=b directly into U(1) symmetric tensor networks (TNs) and leverage their applicability as quantum-inspired generative models.
- Score: 72.41480594026815
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Constrained combinatorial optimization problems abound in industry, from
portfolio optimization to logistics. One of the major roadblocks in solving
these problems is the presence of non-trivial hard constraints which limit the
valid search space. In some heuristic solvers, these are typically addressed by
introducing certain Lagrange multipliers in the cost function, by relaxing them
in some way, or worse yet, by generating many samples and only keeping valid
ones, which leads to very expensive and inefficient searches. In this work, we
encode arbitrary integer-valued equality constraints of the form Ax=b directly
into U(1) symmetric tensor networks (TNs) and leverage their applicability as
quantum-inspired generative models to assist in the search of solutions to
combinatorial optimization problems. This allows us to exploit the
generalization capabilities of TN generative models while constraining them so
that they only output valid samples. Our constrained TN generative model
efficiently captures the constraints by reducing the number of parameters and
computational costs. We find that at tasks with constraints given by arbitrary
equalities, symmetric Matrix Product States outperform their standard
unconstrained counterparts at finding novel and better solutions to
combinatorial optimization problems.
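To make the charge-conservation idea concrete, here is a hedged, minimal sketch (not the authors' implementation, which works with full symmetric MPS tensors): for the single cardinality constraint 1^T x = k, the U(1) "charge" carried across a bond is the running sum of the bits generated so far, and each local choice is masked so that the remaining sites can still close the constraint. The function name and the per-site Bernoulli parameterization below are illustrative assumptions.

```python
import numpy as np

def sample_fixed_sum(n, k, probs, rng):
    """Sample x in {0,1}^n with sum(x) = k. probs[i] is an unconstrained
    per-site Bernoulli parameter, re-normalized over the feasible choices."""
    x, charge = [], 0
    for i in range(n):
        remaining = n - i - 1
        # Selection rule: bit value v is allowed only if the running charge
        # can still be closed to exactly k by the remaining sites.
        w = np.array([
            (probs[i] if v else 1 - probs[i])
            if (charge + v <= k and charge + v + remaining >= k) else 0.0
            for v in (0, 1)
        ])
        v = int(rng.choice(2, p=w / w.sum()))
        x.append(v)
        charge += v
    return np.array(x)

rng = np.random.default_rng(0)
samples = [sample_fixed_sum(8, 3, np.full(8, 0.4), rng) for _ in range(1000)]
assert all(s.sum() == 3 for s in samples)   # every sample is valid by construction
print(np.mean(samples, axis=0))             # per-site marginals
```

Every sample satisfies the constraint by construction, so no rejection step is needed; a U(1) symmetric TN hard-wires exactly this selection rule into block-sparse tensors.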
Related papers
- Cons-training tensor networks [2.8834278113855896]
We introduce a novel family of tensor networks, termed constrained matrix product states (MPS).
These networks incorporate arbitrary discrete linear constraints, including inequalities, exactly into sparse block structures.
These networks are particularly tailored for modeling distributions with support strictly over the feasible space.
arXiv Detail & Related papers (2024-05-15T00:13:18Z)
- Multi-Agent Bayesian Optimization with Coupled Black-Box and Affine Constraints [21.38692458445459]
We study the problem of distributed multi-agent Bayesian optimization with both coupled black-box constraints and known affine constraints.
A primal-dual distributed algorithm is proposed that achieves similar regret/violation bounds as those in the single-agent case.
arXiv Detail & Related papers (2023-10-02T08:07:36Z)
- Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization [88.0031283949404]
Many real-world problems have complicated nonconvex functional constraints and use a large number of data points.
The proposed method outperforms an existing method that achieved the previously best-known result (the generic augmented Lagrangian template is sketched below).
arXiv Detail & Related papers (2022-12-19T14:48:54Z)
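As a hedged, generic toy of that template (assumed objective, constraint, and step sizes; not the paper's exact algorithm): run an inexact inner SGD loop on the augmented Lagrangian, then update the multiplier.

```python
import numpy as np

# Stochastic augmented Lagrangian sketch for  min E_xi ||x - xi||^2
# subject to  1 - 1^T x <= 0  (i.e. sum(x) >= 1), with xi ~ N(0, I).
rng = np.random.default_rng(0)
d, rho, lam = 5, 10.0, 0.0
x = np.zeros(d)

def c(x):                       # toy inequality constraint: 1 - sum(x) <= 0
    return 1.0 - x.sum()

for outer in range(15):
    for _ in range(300):        # inexact inner solve with stochastic gradients
        xi = rng.normal(size=d)
        grad_f = 2.0 * (x - xi)                    # stoch. gradient of E||x - xi||^2
        slack = max(0.0, c(x) + lam / rho)         # active part of the AL penalty
        grad = grad_f - rho * slack * np.ones(d)   # dc/dx = -1 in every coordinate
        x -= 0.01 * grad
    lam = max(0.0, lam + rho * c(x))               # multiplier (dual) update
print(x, "constraint value:", c(x))
```

The solution concentrates near x = (1/d, ..., 1/d), where the constraint is tight and the multiplier settles at its KKT value.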
- Polynomial Optimization: Enhancing RLT relaxations with Conic Constraints [0.0]
Conic optimization has emerged as a powerful tool for designing tractable and guaranteed algorithms for nonconvex problems.
We investigate the strengthening of the RLT relaxations of optimization problems through the addition of nine different types of constraints.
We show how to design these variants and evaluate their performance with respect to each other and to the standard RLT relaxations.
arXiv Detail & Related papers (2022-08-11T02:13:04Z)
- Constrained mixers for the quantum approximate optimization algorithm [55.41644538483948]
We present a framework for constructing mixing operators that restrict the evolution to a subspace of the full Hilbert space.
We generalize the "XY"-mixer designed to preserve the subspace of "one-hot" states to the general case of subspaces given by an arbitrary number of computational basis states.
Our analysis also leads to valid Trotterizations of the "XY"-mixer with fewer CX gates than previously known (a numerical check of the weight-preservation property is sketched below).
arXiv Detail & Related papers (2022-03-11T17:19:26Z)
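A hedged numerical check of the underlying selection rule (a generic dense-matrix construction, not the paper's circuit compilation): the ring "XY" mixer conserves total Hamming weight, so evolving a one-hot state under it never leaves the one-hot subspace.

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def two_site_op(op1, op2, i, j, n):
    """Tensor product placing op1 at site i and op2 at site j of n qubits."""
    ops = [I2] * n
    ops[i], ops[j] = op1, op2
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

n = 4
# Ring "XY" mixer Hamiltonian: sum_i (X_i X_{i+1} + Y_i Y_{i+1}) / 2
H = sum(
    0.5 * (two_site_op(X, X, i, (i + 1) % n, n) + two_site_op(Y, Y, i, (i + 1) % n, n))
    for i in range(n)
)
U = expm(-1j * 0.7 * H)          # mixing unitary for an arbitrary angle

psi = np.zeros(2 ** n, dtype=complex)
psi[int("0010", 2)] = 1.0        # a one-hot (Hamming-weight-1) basis state
phi = U @ psi
for idx, amp in enumerate(phi):
    if abs(amp) > 1e-12:
        assert bin(idx).count("1") == 1   # support stays in the feasible subspace
print("one-hot subspace preserved")
```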
- Faster Algorithm and Sharper Analysis for Constrained Markov Decision Process [56.55075925645864]
The problem of the constrained Markov decision process (CMDP) is investigated, where an agent aims to maximize the expected accumulated discounted reward subject to multiple constraints.
A new dual convex approach is proposed with a novel integration of three ingredients: a regularized policy, a dual regularizer, and Nesterov's gradient descent in the dual (a toy primal-dual CMDP loop is sketched below).
This is the first demonstration that nonconcave CMDP problems can attain the complexity lower bound of $\mathcal{O}(1/\epsilon)$ for convex optimization subject to convex constraints.
arXiv Detail & Related papers (2021-10-20T02:57:21Z)
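As a hedged illustration of the primal-dual idea (a plain Lagrangian CMDP loop on a made-up two-state instance, not the paper's regularized and accelerated method): alternate an exact policy update for the Lagrangian reward r - lam * c with dual ascent on lam.

```python
import numpy as np

gamma, limit = 0.9, 2.0                    # discount factor and cost budget
P = np.array([[[0.9, 0.1], [0.2, 0.8]],    # transition kernel P[s, a, s']
              [[0.6, 0.4], [0.1, 0.9]]])
r = np.array([[1.0, 2.0], [0.5, 3.0]])     # reward r[s, a]
c = np.array([[0.0, 1.0], [0.0, 2.0]])     # cost   c[s, a]

def solve(reward):
    """Exact value iteration; returns a greedy deterministic policy."""
    V = np.zeros(2)
    for _ in range(500):
        Q = reward + gamma * P @ V
        V = Q.max(axis=1)
    return Q.argmax(axis=1)

def discounted(policy, signal, s0=0):
    """Discounted accumulated signal of a deterministic policy from s0."""
    Ppi = np.array([P[s, policy[s]] for s in range(2)])
    spi = np.array([signal[s, policy[s]] for s in range(2)])
    return np.linalg.solve(np.eye(2) - gamma * Ppi, spi)[s0]

lam = 0.0
for t in range(100):
    pi = solve(r - lam * c)                                    # primal update
    lam = max(0.0, lam + 0.05 * (discounted(pi, c) - limit))   # dual ascent
print("policy:", pi, "cost:", discounted(pi, c), "lambda:", lam)
```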
- Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation [101.22379613810881]
We consider data-driven optimization problems where one must maximize a function given only queries at a fixed set of points.
This problem setting emerges in many domains where function evaluation is a complex and expensive process.
We propose a tractable approximation that allows us to scale our method to high-capacity neural network models.
arXiv Detail & Related papers (2021-02-16T06:04:27Z)
- A Hybrid Framework Using a QUBO Solver For Permutation-Based Combinatorial Optimization [5.460573052311485]
We propose a hybrid framework to solve large-scale permutation-based problems using a high-performance quadratic unconstrained binary optimization solver.
We propose techniques to overcome the challenges of using a QUBO solver that typically comes with a limited number of bits (a toy one-hot penalty encoding is sketched below).
arXiv Detail & Related papers (2020-09-27T07:15:25Z)
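A hedged toy of the standard permutation-to-QUBO encoding such frameworks rely on (the instance, penalty weight, and brute-force check below are illustrative assumptions, not the paper's pipeline): row and column "exactly-one" constraints become quadratic penalties, so the QUBO minimizer is a permutation matrix.

```python
import itertools
import numpy as np

# Toy instance: assign n items to n positions with cost C[i, p].
n = 3
rng = np.random.default_rng(0)
C = rng.integers(1, 10, size=(n, n)).astype(float)

A = 10 * C.max()                 # penalty weight dominating all cost terms
N = n * n                        # one binary variable x[i, p] per (item, position)
Q = np.zeros((N, N))
idx = lambda i, p: i * n + p

for i in range(n):
    for p in range(n):
        Q[idx(i, p), idx(i, p)] += C[i, p]        # linear costs on the diagonal

# (sum_p x_ip - 1)^2 = -sum_p x_ip + 2 sum_{p<q} x_ip x_iq + const for binary x
for i in range(n):               # each item used exactly once
    for p in range(n):
        Q[idx(i, p), idx(i, p)] += -A
        for q in range(p + 1, n):
            Q[idx(i, p), idx(i, q)] += 2 * A
for p in range(n):               # each position filled exactly once
    for i in range(n):
        Q[idx(i, p), idx(i, p)] += -A
        for j in range(i + 1, n):
            Q[idx(j, p), idx(i, p)] += 2 * A

# Brute-force check (a stand-in for the QUBO solver): the minimizer is a permutation.
best = min(itertools.product([0, 1], repeat=N),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print(np.array(best).reshape(n, n))
```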
- Conditional gradient methods for stochastically constrained convex minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees (a toy constraint-subsampling loop is sketched below).
arXiv Detail & Related papers (2020-07-07T21:26:35Z)
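A hedged toy of the constraint-subsampling idea (assumed objective, smoothing, and step sizes; not the paper's precise algorithm): each Frank-Wolfe iteration touches only a random subset of the constraints through a smoothed hinge penalty.

```python
import numpy as np

# Toy problem: min ||x - c||^2 over the simplex, subject to m linear
# constraints a_j^T x <= b_j, handled via a smoothed hinge penalty.
rng = np.random.default_rng(1)
d, m = 10, 50
c = rng.normal(size=d)
A = rng.normal(size=(m, d))
b = A @ (np.ones(d) / d) + 0.1   # constraints feasible near the simplex center

x = np.ones(d) / d
beta, mu = 10.0, 0.01            # penalty weight and hinge smoothing parameter
for t in range(1, 501):
    S = rng.choice(m, size=5, replace=False)   # sampled constraint subset
    viol = A[S] @ x - b[S]
    # gradient of the objective plus smoothed penalties for sampled constraints:
    # the smoothed hinge max(0, v) has gradient clip(v / mu, 0, 1).
    grad = 2 * (x - c) + beta * A[S].T @ np.clip(viol / mu, 0.0, 1.0)
    s = np.zeros(d)
    s[np.argmin(grad)] = 1.0                   # linear minimization over the simplex
    x += (2.0 / (t + 2)) * (s - x)             # standard Frank-Wolfe step
print("max constraint violation:", max(0.0, (A @ x - b).max()))
```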
- Hybrid Variance-Reduced SGD Algorithms For Nonconvex-Concave Minimax Problems [26.24895953952318]
We develop an algorithm to solve a class of nonconvex-concave minimax problems.
It can work with either single-sample or mini-batch derivative estimates.
arXiv Detail & Related papers (2020-06-27T03:05:18Z)