Conservative Stochastic Optimization with Expectation Constraints
- URL: http://arxiv.org/abs/2008.05758v2
- Date: Sat, 29 May 2021 05:32:29 GMT
- Title: Conservative Stochastic Optimization with Expectation Constraints
- Authors: Zeeshan Akhtar, Amrit Singh Bedi, and Ketan Rajawat
- Abstract summary: This paper considers convex optimization problems where the objective and constraint functions involve expectations with respect to the data indices or environmental variables.
Online and efficient approaches for solving such problems have not been widely studied.
We propose a novel conservative stochastic optimization algorithm (CSOA) that achieves zero constraint violation and an $\mathcal{O}\left(T^{-\frac{1}{2}}\right)$ optimality gap.
- Score: 11.393603788068777
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper considers stochastic convex optimization problems where the
objective and constraint functions involve expectations with respect to the
data indices or environmental variables, in addition to deterministic convex
constraints on the domain of the variables. Although the setting is generic and
arises in different machine learning applications, online and efficient
approaches for solving such problems have not been widely studied. Since the
underlying data distribution is unknown a priori, a closed-form solution is
generally not available, and classical deterministic optimization paradigms are
not applicable. State-of-the-art approaches, such as those using the saddle
point framework, can ensure that the optimality gap as well as the constraint
violation decay as $\mathcal{O}\left(T^{-\frac{1}{2}}\right)$ where $T$ is the number of
stochastic gradients. The domain constraints are assumed simple and handled via
projection at every iteration. In this work, we propose a novel conservative
stochastic optimization algorithm (CSOA) that achieves zero constraint
violation and $\mathcal{O}\left(T^{-\frac{1}{2}}\right)$ optimality gap.
Further, the projection operation (for scenarios when calculating projection
is expensive) in the proposed algorithm can be avoided by considering the
conditional gradient or Frank-Wolfe (FW) variant of the algorithm. The
state-of-the-art stochastic FW variants achieve an optimality gap of
$\mathcal{O}\left(T^{-\frac{1}{3}}\right)$ after $T$ iterations, though these algorithms
have not been applied to problems with functional expectation constraints. In
this work, we propose the FW-CSOA algorithm that is not only projection-free
but also achieves zero constraint violation with
$\mathcal{O}\left(T^{-\frac{1}{4}}\right)$ decay of the optimality gap. The efficacy of
the proposed algorithms is tested on two relevant problems: fair classification
and structured matrix completion.
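To make the conservative idea in the abstract concrete, below is a minimal, hypothetical Python sketch of a stochastic primal-dual update in the spirit of CSOA. Everything here is an illustrative assumption rather than the paper's exact method: the oracle names (`grad_f`, `grad_g`, `g_val`, `project`) and the $T^{-1/2}$ step-size and tightening schedules are placeholders chosen only to mirror the rates quoted above.

```python
import numpy as np

def csoa_sketch(grad_f, grad_g, g_val, project, x0, T):
    """Hedged sketch of a conservative stochastic primal-dual method.

    Targets min E[f(x)] s.t. E[g(x)] <= 0, x in X, by tightening the
    expectation constraint to E[g(x)] <= -nu (the "conservative" idea)
    and running Arrow-Hurwicz-style saddle-point updates. All schedules
    are illustrative assumptions, not the paper's exact choices.
    """
    x = np.asarray(x0, dtype=float).copy()
    lam = 0.0                       # dual variable for the constraint
    eta = 1.0 / np.sqrt(T)          # step size ~ T^{-1/2} (assumed)
    nu = 1.0 / np.sqrt(T)           # conservative margin ~ T^{-1/2} (assumed)
    x_avg = np.zeros_like(x)
    for _ in range(T):
        # primal descent on the stochastic Lagrangian f(x) + lam * g(x),
        # followed by projection onto the simple deterministic set X
        x = project(x - eta * (grad_f(x) + lam * grad_g(x)))
        # dual ascent on the tightened constraint g(x) + nu <= 0
        lam = max(0.0, lam + eta * (g_val(x) + nu))
        x_avg += x
    return x_avg / T                # averaged iterate

# Toy usage: min E[(x - 2)^2] s.t. E[x] - 1 <= 0 over the box [-5, 5];
# the solution x* = 1 sits on the constraint boundary.
rng = np.random.default_rng(0)
x_hat = csoa_sketch(
    grad_f=lambda x: 2 * (x - 2.0) + 0.1 * rng.standard_normal(x.shape),
    grad_g=lambda x: np.ones_like(x),
    g_val=lambda x: float(x[0] - 1.0) + 0.1 * rng.standard_normal(),
    project=lambda x: np.clip(x, -5.0, 5.0),
    x0=np.zeros(1),
    T=20000,
)
```

The margin `nu` deliberately biases the iterates into the feasible region, which is how an averaged solution can satisfy the original constraint exactly while paying only a lower-order price in the objective.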
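A projection-free counterpart in the spirit of FW-CSOA can be sketched the same way, again under assumed placeholder schedules: the projection is replaced by a linear-minimization oracle `lmo` over X, and the stochastic Lagrangian gradients are smoothed by recursive averaging (gradient tracking) to control their variance.

```python
import numpy as np

def fw_csoa_sketch(grad_f, grad_g, g_val, lmo, x0, T):
    """Hedged sketch of a projection-free (Frank-Wolfe) variant.

    `lmo(d)` returns argmin over s in X of <d, s>. The decay exponents
    below are illustrative placeholders, not the paper's parameters.
    """
    x = np.asarray(x0, dtype=float).copy()
    lam = 0.0
    d = np.zeros_like(x)                   # tracked gradient estimate
    for t in range(1, T + 1):
        rho = t ** -0.5                    # tracking weight (assumed)
        gamma = 2.0 / (t + 1)              # FW step size (assumed)
        nu = t ** -0.25                    # conservative margin (assumed)
        # recursive averaging of stochastic Lagrangian gradients
        d = (1 - rho) * d + rho * (grad_f(x) + lam * grad_g(x))
        s = lmo(d)                         # linear-minimization step over X
        x = x + gamma * (s - x)            # convex-combination update
        lam = max(0.0, lam + gamma * (g_val(x) + nu))
    return x
```

Trading the projection for an LMO is what costs the slower $\mathcal{O}\left(T^{-\frac{1}{4}}\right)$ rate quoted above; the tracked average `d` is the usual device that lets stochastic Frank-Wolfe methods tolerate noisy gradients without large mini-batches.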
Related papers
- Nonsmooth Projection-Free Optimization with Functional Constraints [12.20060970177791]
This paper presents a subgradient-based algorithm for constrained nonsmooth convex optimization.
Our proposed algorithm can handle nonsmooth problems with general convex functional inequality constraints.
Similar performance is observed when deterministic subgradients are replaced with stochastic subgradients.
arXiv Detail & Related papers (2023-11-18T23:06:33Z) - An Oblivious Stochastic Composite Optimization Algorithm for Eigenvalue Optimization Problems [76.2042837251496]
We introduce two oblivious mirror descent algorithms based on a complementary composite setting.
Remarkably, both algorithms work without prior knowledge of the Lipschitz constant or smoothness of the objective function.
We show how our framework scales, and we demonstrate the efficiency and robustness of our methods on large-scale semidefinite programs.
arXiv Detail & Related papers (2023-06-30T08:34:29Z) - Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit analogies between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z) - Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization [88.0031283949404]
Many real-world problems have complicated nonconvex functional constraints and use a large number of data points.
Our proposed method outperforms an existing method that achieved the previously best-known complexity result.
arXiv Detail & Related papers (2022-12-19T14:48:54Z) - A Projection-free Algorithm for Constrained Stochastic Multi-level Composition Optimization [12.096252285460814]
We propose a projection-free conditional gradient-type algorithm for constrained stochastic multi-level composition optimization.
We show that the number of calls to the stochastic first-order oracle and the linear-minimization oracle required by the proposed algorithm are of order $\mathcal{O}_T(\epsilon^{-2})$ and $\mathcal{O}_T(\epsilon^{-3})$, respectively.
arXiv Detail & Related papers (2022-02-09T06:05:38Z) - Zeroth and First Order Stochastic Frank-Wolfe Algorithms for Constrained Optimization [13.170519806372075]
Problems of convex optimization with two sets of constraints arise frequently in the context of semidefinite programming.
Since projection onto the first set of constraints is difficult, it becomes necessary to explore projection-free algorithms.
The efficacy of the proposed algorithms is tested on relevant applications of sparse matrix estimation, clustering via semidefinite relaxation, and uniform sparsest cut problem.
arXiv Detail & Related papers (2021-07-14T08:01:30Z) - A stochastic linearized proximal method of multipliers for convex stochastic optimization with expectation constraints [8.133190610747974]
We present a computable approximation-type algorithm, namely the stochastic linearized proximal method of multipliers.
Some preliminary numerical results demonstrate the performance of the proposed algorithm.
arXiv Detail & Related papers (2021-06-22T07:24:17Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - A Two-Timescale Framework for Bilevel Optimization: Complexity Analysis and Application to Actor-Critic [142.1492359556374]
Bilevel optimization is a class of problems which exhibit a two-level structure.
We propose a two-timescale stochastic approximation (TTSA) algorithm for tackling such a bilevel problem.
We show that a two-timescale natural actor-critic policy optimization algorithm can be viewed as a special case of our TTSA framework.
arXiv Detail & Related papers (2020-07-10T05:20:02Z) - Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving private convex optimization algorithms, both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.