Differentially Private Convex Optimization with Feasibility Guarantees
- URL: http://arxiv.org/abs/2006.12338v1
- Date: Mon, 22 Jun 2020 15:30:52 GMT
- Title: Differentially Private Convex Optimization with Feasibility Guarantees
- Authors: Vladimir Dvorkin and Ferdinando Fioretto and Pascal Van Hentenryck and
Jalal Kazempour and Pierre Pinson
- Abstract summary: This paper develops a novel differentially private framework to solve convex optimization problems.
The proposed framework provides a trade-off between the expected optimality loss and the variance of optimization results.
- Score: 44.36831037077509
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper develops a novel differentially private framework to solve convex
optimization problems with sensitive optimization data and complex physical or
operational constraints. Unlike standard noise-additive algorithms, which act
primarily on the problem data, objective, or solution and disregard the problem
constraints, this framework requires the optimization variables to be a
function of the noise and exploits a chance-constrained problem reformulation
with formal feasibility guarantees. The noise is calibrated to provide
differential privacy for identity and linear queries on the optimization
solution. For many applications, including resource allocation problems, the
proposed framework provides a trade-off between the expected optimality loss
and the variance of optimization results.
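As a concrete illustration of the mechanism described in the abstract, here is a minimal sketch (not the authors' implementation) of a resource-allocation LP in which individual chance constraints are reformulated analytically for Gaussian noise, so the noise-perturbed published solution remains feasible with probability 1 - eta per constraint; the noise scale follows the standard Gaussian-mechanism calibration. All data, names, and constants are illustrative assumptions.

```python
# Sketch: chance-constrained resource allocation whose published solution is
# perturbed by Gaussian-mechanism noise; margins keep the noisy output feasible.
import numpy as np
import cvxpy as cp
from scipy.stats import norm

n = 5                                 # number of resources (illustrative)
c = np.array([3., 1., 4., 2., 5.])    # hypothetical costs
cap = np.full(n, 10.0)                # per-resource capacity
demand = 20.0

# Gaussian mechanism: sigma calibrated for (eps, delta)-DP of an identity
# query on the solution, given an assumed L2 sensitivity Delta.
eps, delta, Delta = 1.0, 1e-5, 0.5
sigma = Delta * np.sqrt(2 * np.log(1.25 / delta)) / eps

eta = 0.05                            # per-constraint violation probability
margin = norm.ppf(1 - eta) * sigma    # analytic chance-constraint margin

x = cp.Variable(n)
constraints = [
    cp.sum(x) >= demand + margin * np.sqrt(n),  # noise on the sum has std sigma*sqrt(n)
    x >= margin,                                # P(x_i + xi_i >= 0)      >= 1 - eta
    x <= cap - margin,                          # P(x_i + xi_i <= cap_i)  >= 1 - eta
]
cp.Problem(cp.Minimize(c @ x), constraints).solve()

rng = np.random.default_rng(0)
x_private = x.value + rng.normal(0.0, sigma, size=n)  # released DP solution
```

Because the only data-dependent quantity released is the perturbed vector, privacy follows from the Gaussian mechanism, while the margins make the released point feasible with high probability.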
Related papers
- Learning Joint Models of Prediction and Optimization [56.04498536842065]
The Predict-Then-Optimize framework uses machine learning models to predict unknown parameters of an optimization problem from features before solving it.
This paper proposes an alternative method, in which optimal solutions are learned directly from the observable features by joint predictive models.
arXiv Detail & Related papers (2024-09-07T19:52:14Z)
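A minimal sketch of the joint-model idea in the entry above, under strong simplifying assumptions (a linear model and precomputed optimal solutions; all data is synthetic and hypothetical): the model maps observed features directly to solutions rather than first predicting problem parameters.

```python
# Toy "learn the solution directly" setup: fit features -> optimal solution.
import numpy as np

rng = np.random.default_rng(4)
Phi = rng.normal(size=(100, 3))                   # observed features
X_opt = Phi @ np.array([[1.], [0.5], [2.]])       # precomputed optimal solutions
W, *_ = np.linalg.lstsq(Phi, X_opt, rcond=None)   # joint predictive model
x_pred = Phi[:1] @ W                              # directly predicted solution
```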
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
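The equivalence claimed in the entry above can be illustrated with a toy fixed-point map: rather than backpropagating through many unrolled iterations, the backward pass reduces to one linear solve at the fixed point. This is a reconstruction of the general idea, not the paper's system; the map and loss are arbitrary examples.

```python
# Backward pass of a fixed-point iteration x_{k+1} = T(x_k, theta) as a
# single linear system: dL/dtheta = dL/dx (I - dT/dx)^{-1} dT/dtheta.
import numpy as np

theta = np.array([0.3, 0.7])

def T(x, th):                 # contraction with fixed point x* = 2*theta
    return 0.5 * x + th

x = np.zeros(2)
for _ in range(50):           # forward pass: iterate to the fixed point
    x = T(x, theta)

dT_dx = 0.5 * np.eye(2)       # Jacobians of T at the fixed point
dT_dth = np.eye(2)
dL_dx = 2 * (x - 1.0)         # loss L = ||x* - 1||^2

# backward pass = one linear solve instead of K unrolled steps
dL_dth = np.linalg.solve((np.eye(2) - dT_dx).T, dL_dx) @ dT_dth
```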
- Optimizing Chance-Constrained Submodular Problems with Variable Uncertainties [12.095075636344536]
We study chance-constrained submodular optimization problems, which capture a wide range of problems with constraints.
We present greedy algorithms that obtain a high-quality solution, i.e., a constant-factor approximation of the optimal solution.
arXiv Detail & Related papers (2023-09-23T04:48:49Z)
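A hypothetical sketch of a greedy rule of the kind described above, using a one-sided Chebyshev (Cantelli) surrogate to enforce the chance constraint on uncertain weights; the value function, data, and multiplier choice are illustrative assumptions rather than the paper's exact algorithm.

```python
# Greedy selection under a chance constraint on uncertain item weights:
# accept an item only if mean + lambda * std of the total weight fits the budget.
import math

items = {"a": (5.0, 1.0), "b": (3.0, 0.5), "c": (4.0, 2.0)}  # (mean, variance)
budget, alpha = 8.0, 0.1                  # capacity, allowed violation prob.
lam = math.sqrt((1 - alpha) / alpha)      # Cantelli: P(X > mu + lam*sd) <= alpha

def value(S):                             # toy monotone submodular objective
    return math.sqrt(sum(items[i][0] for i in S))

S, mean, var = set(), 0.0, 0.0
while True:
    best, gain = None, 0.0
    for i in set(items) - S:
        m, v = mean + items[i][0], var + items[i][1]
        if m + lam * math.sqrt(v) <= budget and value(S | {i}) - value(S) > gain:
            best, gain = i, value(S | {i}) - value(S)
    if best is None:
        break
    S.add(best)
    mean += items[best][0]
    var += items[best][1]
```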
- Multiobjective variational quantum optimization for constrained problems: an application to Cash Management [45.82374977939355]
We introduce a new method for solving optimization problems with challenging constraints using variational quantum algorithms.
We test our proposal on a real-world problem with great relevance in finance: the Cash Management problem.
Our empirical results show a significant improvement in the cost of the achieved solutions and, especially, in the avoidance of local minima.
arXiv Detail & Related papers (2023-02-08T17:09:20Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
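To make the unrolling strategy in the entry above concrete, here is a minimal sketch (assuming a box-constrained quadratic program and a fixed step size; not the paper's folded implementation) in which the solver loop is written in JAX so reverse-mode automatic differentiation backpropagates through its iterations.

```python
# Algorithm unrolling: differentiate a loss through the iterations of a
# projected-gradient solver treated as a differentiable "argmin layer".
import jax
import jax.numpy as jnp

def solve_unrolled(Q, q, iters=100, lr=0.1):
    """Projected gradient descent on min 0.5 x'Qx + q'x  s.t. x >= 0."""
    x = jnp.zeros_like(q)
    for _ in range(iters):
        x = jnp.clip(x - lr * (Q @ x + q), 0.0)   # gradient step + projection
    return x

def loss(q):
    Q = jnp.eye(2) * 2.0
    x_star = solve_unrolled(Q, q)                 # unrolled solver call
    return jnp.sum((x_star - 1.0) ** 2)

g = jax.grad(loss)(jnp.array([-1.0, -3.0]))       # d loss / d q through the solver
```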
- Bring Your Own Algorithm for Optimal Differentially Private Stochastic Minimax Optimization [44.52870407321633]
The holy grail in these settings is to guarantee the optimal trade-off between privacy and the excess population loss.
We provide a general framework for solving differentially private minimax optimization (DP-SMO) problems.
Our framework is inspired by the recently proposed Phased-ERM method [20] for nonsmooth differentially private stochastic convex optimization (DP-SCO).
arXiv Detail & Related papers (2022-06-01T10:03:20Z)
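For context on the entry above: its "bring your own algorithm" framework wraps non-private solvers, whereas the standard baseline is plain gradient perturbation, sketched below for a toy convex-concave objective. The clipping threshold, noise scale, and step size are illustrative; calibrating sigma to a specific (epsilon, delta) budget is omitted.

```python
# Generic, heavily simplified noisy clipped gradient descent-ascent for a
# smooth convex-concave minimax objective; NOT the paper's Phased-ERM framework.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))

def grads(x, y):              # f(x, y) = x'Ay + 0.5||x||^2 - 0.5||y||^2
    return A @ y + x, A.T @ x - y

x, y = np.zeros(3), np.zeros(3)
lr, sigma, clip = 0.05, 0.8, 1.0      # noise scale set by the privacy budget
for _ in range(200):
    gx, gy = grads(x, y)
    gx = gx / max(1.0, np.linalg.norm(gx) / clip)   # clip, then perturb
    gy = gy / max(1.0, np.linalg.norm(gy) / clip)
    x -= lr * (gx + sigma * rng.normal(size=3))     # descent on x
    y += lr * (gy + sigma * rng.normal(size=3))     # ascent on y
```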
- Heuristic Strategies for Solving Complex Interacting Stockpile Blending Problem with Chance Constraints [14.352521012951865]
In this paper, we consider the uncertainty in material grades and introduce chance constraints to ensure that the constraints are satisfied with high confidence.
To address the stockpile blending problem with chance constraints, we propose a differential evolution algorithm combining two repair operators.
arXiv Detail & Related papers (2021-02-10T07:56:18Z)
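A rough illustration of the combination described above (hypothetical data and a linear objective, far simpler than the stockpile blending problem): differential evolution paired with a repair operator that maps infeasible candidates back onto a resource budget.

```python
# DE/rand/1/bin with a simple rescaling repair operator for sum(x) <= budget.
import numpy as np

rng = np.random.default_rng(2)
n, pop_size, budget = 4, 20, 1.0
profit = np.array([0.4, 0.9, 0.6, 0.3])

def repair(x):                        # project back onto x >= 0, sum(x) <= budget
    x = np.clip(x, 0.0, None)
    s = x.sum()
    return x if s <= budget else x * (budget / s)

pop = np.array([repair(rng.random(n)) for _ in range(pop_size)])
for _ in range(100):
    for i in range(pop_size):
        a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
        trial = np.where(rng.random(n) < 0.9, a + 0.8 * (b - c), pop[i])
        trial = repair(trial)                      # restore feasibility
        if profit @ trial > profit @ pop[i]:       # greedy selection
            pop[i] = trial
best = pop[np.argmax(pop @ profit)]
```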
- Automatically Learning Compact Quality-aware Surrogates for Optimization Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in the model training pipeline results in predictions of the unobserved parameters that lead to higher-quality decisions.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z)
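The surrogate idea in the entry above can be sketched in a few lines (illustrative only: a fixed random projection stands in for the learned mapping): the large decision vector is re-parameterized through a low-dimensional variable, shrinking the problem that must be solved.

```python
# Low-dimensional surrogate: optimize over y with x = P y, where P maps a
# small latent space into the full decision space. Problem data is synthetic.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(5)
n_full, n_low = 200, 10
A = rng.normal(size=(50, n_full))
b = rng.normal(size=50)
P = rng.normal(size=(n_full, n_low))   # stand-in for a learned projection

y = cp.Variable(n_low)                 # surrogate variable: 10 dims, not 200
x = P @ y
cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b))).solve()
```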
- Bilevel Optimization for Differentially Private Optimization in Energy Systems [53.806512366696275]
This paper studies how to apply differential privacy to constrained optimization problems whose inputs are sensitive.
The paper shows that, under a natural assumption, a bilevel model can be solved efficiently for large-scale nonlinear optimization problems.
arXiv Detail & Related papers (2020-01-26T20:15:28Z)
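As a simplified reconstruction of the entry above (not the paper's exact bilevel formulation): perturb the sensitive input with calibrated noise, then re-solve the constrained problem, so the released solution is feasible and privacy follows by post-processing.

```python
# Perturb a sensitive demand value with Laplace noise, then re-optimize so the
# published dispatch respects the physical limits. Data is illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(3)
d_true = 7.0                          # sensitive demand value
eps, sensitivity = 1.0, 0.5
d_noisy = d_true + rng.laplace(scale=sensitivity / eps)
d_noisy = np.clip(d_noisy, 0.0, 12.0) # keep the noisy target physically attainable

x = cp.Variable(3)
constraints = [cp.sum(x) == d_noisy, x >= 0, x <= 4]   # physical limits
cp.Problem(cp.Minimize(cp.sum_squares(x)), constraints).solve()
# x.value is DP by post-processing: only d_noisy depends on the sensitive data.
```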