P-split formulations: A class of intermediate formulations between big-M and convex hull for disjunctive constraints
- URL: http://arxiv.org/abs/2202.05198v2
- Date: Mon, 27 May 2024 12:41:17 GMT
- Title: P-split formulations: A class of intermediate formulations between big-M and convex hull for disjunctive constraints
- Authors: Jan Kronqvist, Ruth Misener, Calvin Tsay
- Abstract summary: "P-split" formulations are derived for disjunctive constraints with convex constraints within each disjunct.
We show that the P-split formulations form a hierarchy starting from a big-M equivalent and converging to the convex hull.
We computationally compare the P-split formulations against big-M and convex hull formulations on 344 test instances.
- Score: 6.258016078705562
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a class of mixed-integer formulations for disjunctive constraints intermediate to the big-M and convex hull formulations in terms of relaxation strength. The main idea is to capture the best of both the big-M and convex hull formulations: a computationally light formulation with a tight relaxation. The "P-split" formulations are based on a lifted transformation that splits convex additively separable constraints into P partitions and forms the convex hull of the linearized and partitioned disjunction. The "P-split" formulations are derived for disjunctive constraints with convex constraints within each disjunct, and we generalize the results for the case with nonconvex constraints within the disjuncts. We analyze the continuous relaxation of the P-split formulations and show that, under certain assumptions, the formulations form a hierarchy starting from a big-M equivalent and converging to the convex hull. The goal of the P-split formulations is to form strong approximations of the convex hull through a computationally simpler formulation. We computationally compare the P-split formulations against big-M and convex hull formulations on 344 test instances. The test problems include K-means clustering, semi-supervised clustering, P_ball problems, and optimization over trained ReLU neural networks. The computational results show the promising potential of the P-split formulations. For many of the test problems, P-split formulations are solved with a similar number of explored nodes as the convex hull formulation, while reducing the solution time by an order of magnitude and outperforming big-M both in time and number of explored nodes.
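To make the lifting concrete, here is a minimal sketch of the formulation levels for a two-term disjunction with one additively separable convex constraint per disjunct. The notation (g_{k,i}, y_k, s_{k,p}, M_k) is ours, chosen to mirror the abstract, and is not taken verbatim from the paper:

```latex
% Disjunction with additively separable convex constraints, x \in [\ell, u]:
\bigvee_{k \in \{1,2\}} \left[\, \sum_{i=1}^{n} g_{k,i}(x_i) \le 0 \,\right]

% Big-M formulation: binaries y_k select the active disjunct; M_k is any valid
% upper bound on the left-hand side over the box, switching the constraint off.
\sum_{i=1}^{n} g_{k,i}(x_i) \le M_k (1 - y_k), \quad k = 1, 2, \qquad
y_1 + y_2 = 1, \quad y \in \{0, 1\}^2

% P-split lifting: partition \{1, \dots, n\} into S_1, \dots, S_P and bound
% each partition's contribution by a lifted variable s_{k,p}, so that every
% disjunct becomes linear in the lifted (x, s)-space:
\sum_{i \in S_p} g_{k,i}(x_i) \le s_{k,p}, \quad p = 1, \dots, P, \qquad
\text{disjunct } k: \; \sum_{p=1}^{P} s_{k,p} \le 0
```

The convex hull of the partitioned, linearized disjunction can then be formed by standard disaggregation over the lifted variables. Per the abstract, P = 1 yields a big-M-equivalent relaxation, and increasing P tightens the relaxation toward the convex hull of the original disjunction under the stated assumptions.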
Related papers
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for the particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
- Targeted Separation and Convergence with Kernel Discrepancies [61.973643031360254]
Kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or (ii) control weak convergence to P.
In this article, we derive new sufficient and necessary conditions to ensure (i) and (ii).
For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels.
arXiv Detail & Related papers (2022-09-26T16:41:16Z)
- Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization [61.26619639722804]
We propose a conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms.
The proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques.
arXiv Detail & Related papers (2022-02-26T19:10:48Z)
- On the Complexity of a Practical Primal-Dual Coordinate Method [63.899427212054995]
We prove complexity bounds for the primal-dual algorithm with random extrapolation and coordinate descent (PURE-CD).
It has been shown to obtain good practical performance for solving bilinear min-max problems.
arXiv Detail & Related papers (2022-01-19T16:14:27Z)
- Posterior-Aided Regularization for Likelihood-Free Inference [23.708122045184698]
Posterior-Aided Regularization (PAR) is applicable to learning the density estimator, regardless of the model structure.
We provide a unified estimation method of PAR to estimate both the reverse KL term and the mutual information term with a single neural network.
arXiv Detail & Related papers (2021-02-15T16:59:30Z)
- Partition-based formulations for mixed-integer optimization of trained ReLU neural networks [66.88252321870085]
This paper introduces a class of mixed-integer formulations for trained ReLU neural networks.
At one extreme, one partition per input recovers the convex hull of a node, i.e., the tightest possible formulation for each node (a big-M baseline sketch follows at the end of this list).
arXiv Detail & Related papers (2021-02-08T17:27:34Z)
- Between steps: Intermediate relaxations between big-M and convex hull formulations [4.769747792846004]
We develop a class of relaxations in between the big-M and convex hull formulations of disjunctions.
The proposed "P-split" formulations split convex additively separable constraints into P partitions and form the convex hull of the disjuncts.
arXiv Detail & Related papers (2021-01-29T18:03:15Z)
- Conditional gradient methods for stochastically constrained convex minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z)
- MINA: Convex Mixed-Integer Programming for Non-Rigid Shape Alignment [77.38594866794429]
We present a convex mixed-integer programming formulation for non-rigid shape matching.
We propose a novel shape deformation model based on an efficient low-dimensional discrete model.
arXiv Detail & Related papers (2020-02-28T09:54:06Z)
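As a companion to the "Partition-based formulations" entry above, the baseline that both the partition-based and P-split formulations tighten is the standard big-M encoding of a single ReLU node y = max(0, w^T x + b). Below is a minimal Pyomo sketch of that baseline; the toy weights, bounds, and variable names are our own illustrative assumptions, not code from either paper:

```python
# Standard big-M MILP encoding of one ReLU node y = max(0, w^T x + b),
# the baseline that partition-based / P-split formulations tighten.
# Toy weights, bounds, and names are illustrative assumptions.
import pyomo.environ as pyo

w, b = [1.0, -2.0, 0.5], 0.25                  # one node's weights and bias
lo, up = [-1.0, -1.0, -1.0], [1.0, 1.0, 1.0]   # box bounds on the inputs

# Interval bounds on the pre-activation a = w^T x + b over the input box.
LB = b + sum(min(wi * l, wi * u) for wi, l, u in zip(w, lo, up))
UB = b + sum(max(wi * l, wi * u) for wi, l, u in zip(w, lo, up))

m = pyo.ConcreteModel()
m.x = pyo.Var(range(3), bounds=lambda m, i: (lo[i], up[i]))
m.y = pyo.Var(bounds=(0, max(UB, 0.0)))        # ReLU output is nonnegative
m.sigma = pyo.Var(domain=pyo.Binary)           # 1 iff the node is active

a = sum(w[i] * m.x[i] for i in range(3)) + b   # pre-activation expression
m.lower = pyo.Constraint(expr=m.y >= a)                     # y >= a always
m.off = pyo.Constraint(expr=m.y <= a - LB * (1 - m.sigma))  # sigma=1: y <= a
m.on = pyo.Constraint(expr=m.y <= UB * m.sigma)             # sigma=0: y = 0

# Any objective completes the model; here we maximize the node output.
m.obj = pyo.Objective(expr=m.y, sense=pyo.maximize)
# pyo.SolverFactory("glpk").solve(m)           # requires an installed MIP solver
```

Partitioning the inputs replaces the single sum w^T x with P group sums, each with its own bounds; as the entry notes, one partition per input recovers the convex hull of the node, while coarser partitions interpolate between that hull and the big-M baseline above.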
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.