Portfolio optimization with discrete simulated annealing
- URL: http://arxiv.org/abs/2210.00807v1
- Date: Mon, 3 Oct 2022 10:39:05 GMT
- Title: Portfolio optimization with discrete simulated annealing
- Authors: Álvaro Rubio-García and Juan José García-Ripoll and Diego Porras
- Abstract summary: We present an integer optimization method to find optimal portfolios in the presence of discretized convex and non-convex cost functions.
This allows us to achieve a solution with a given quality.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Portfolio optimization is an important process in finance that consists in
finding the optimal asset allocation that maximizes expected returns while
minimizing risk. When assets are allocated in discrete units, this is a
combinatorial optimization problem that can be addressed by quantum and
quantum-inspired algorithms. In this work we present an integer simulated
annealing method to find optimal portfolios in the presence of discretized
convex and non-convex cost functions. Our algorithm can deal with large size
portfolios with hundreds of assets. We introduce a performance metric, the time
to target, based on a lower bound to the cost function obtained with the
continuous relaxation of the combinatorial optimization problem. This metric
allows us to quantify the time required to achieve a solution with a given
quality. We carry out numerical experiments and we benchmark the algorithm in
two situations: (i) Monte Carlo instances are started at random, and (ii) the
algorithm is warm-started with an initial instance close to the continuous
relaxation of the problem. We find that in the case of warm-starting with
convex cost functions, the time to target does not grow with the size of the
optimization problem, so discretized versions of convex portfolio optimization
problems are not hard to solve using classical resources. We have applied our
method to the problem of re-balancing in the presence of non-convex transaction
costs, and we have found that our algorithm can efficiently minimize those
terms.
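The integer simulated annealing described in the abstract can be illustrated with a minimal sketch: minimize a discretized mean-variance objective over integer allocations, using budget-preserving single-unit moves and a geometric cooling schedule. All parameter names and defaults here are illustrative assumptions, not the paper's actual implementation; the paper's warm-start variant would replace the even initial allocation with a rounding of the continuous relaxation.

```python
import math
import random

def anneal_portfolio(mu, sigma, budget, gamma=1.0, steps=20000,
                     t0=1.0, t_end=1e-3, seed=0):
    """Minimize gamma * w'Sigma w - mu'w over integer allocations w >= 0
    with sum(w) == budget (a hypothetical discretized mean-variance cost)."""
    rng = random.Random(seed)
    n = len(mu)
    # Even integer start; warm-starting from the continuous relaxation,
    # as in the paper, would replace this step.
    w = [budget // n] * n
    w[0] += budget - sum(w)

    def cost(w):
        risk = sum(sigma[i][j] * w[i] * w[j]
                   for i in range(n) for j in range(n))
        ret = sum(mu[i] * w[i] for i in range(n))
        return gamma * risk - ret

    c = cost(w)
    best, best_c = list(w), c
    for k in range(steps):
        # Geometric cooling from t0 down to t_end.
        t = t0 * (t_end / t0) ** (k / steps)
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j or w[i] == 0:
            continue
        # Move one unit from asset i to asset j: budget stays fixed.
        w[i] -= 1
        w[j] += 1
        c_new = cost(w)
        if c_new <= c or rng.random() < math.exp(-(c_new - c) / t):
            c = c_new
            if c < best_c:
                best, best_c = list(w), c
        else:
            # Reject: undo the move.
            w[i] += 1
            w[j] -= 1
    return best, best_c
```

The budget-preserving move keeps every visited state feasible, so no penalty terms are needed for the budget constraint; non-convex terms such as transaction costs would simply be added to `cost`.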
Related papers
- First-Order Dynamic Optimization for Streaming Convex Costs [0.0]
We develop an approach to track the optimal solution with a bounded error.
Our algorithm uses only the first-order derivatives of the cost function.
arXiv Detail & Related papers (2023-10-11T22:41:00Z) - Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit the connection between first-order algorithms for constrained optimization and non-smooth systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z) - Outlier-Robust Sparse Estimation via Non-Convex Optimization [73.18654719887205]
We explore the connection between high-dimensional statistics and non-convex optimization in the presence of sparsity constraints.
We develop novel and simple optimization formulations for these problems.
As a corollary, we obtain that any first-order method that efficiently converges to stationary points yields an efficient algorithm for these tasks.
arXiv Detail & Related papers (2021-09-23T17:38:24Z) - Grouped Variable Selection with Discrete Optimization: Computational and
Statistical Perspectives [9.593208961737572]
We present a new algorithmic framework for grouped variable selection that is based on discrete mathematical optimization.
Our methodology covers both high-dimensional linear regression and non-additive modeling with sparse programming.
Our exact algorithm is based on a standalone branch-and-bound (BnB) framework, which can solve the associated mixed integer programming (MIP) problem to certified optimality.
arXiv Detail & Related papers (2021-04-14T19:21:59Z) - Adaptive Sampling for Best Policy Identification in Markov Decision
Processes [79.4957965474334]
We investigate the problem of best-policy identification in discounted Markov Decision Processes (MDPs) when the learner has access to a generative model.
The advantages of state-of-the-art algorithms are discussed and illustrated.
arXiv Detail & Related papers (2020-09-28T15:22:24Z) - Warm-starting quantum optimization [6.832341432995627]
We discuss how to warm-start quantum optimization with an initial state corresponding to the solution of a relaxation of an optimization problem.
This allows the quantum algorithm to inherit the performance guarantees of the classical algorithm.
It is straightforward to apply the same ideas to other randomized-rounding schemes and optimization problems.
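The randomized-rounding idea behind warm-starting can be sketched classically: given a fractional solution of a relaxed 0/1 problem, each bit is set independently with the corresponding probability. This is a generic rounding step under assumed conventions, not the paper's quantum construction, which prepares an initial quantum state from the relaxed solution instead of sampling classically.

```python
import random

def randomized_round(x_relaxed, seed=0):
    """Round a fractional solution x in [0,1]^n of a relaxed 0/1 problem
    to a bitstring: bit i is set to 1 with probability x_relaxed[i]."""
    rng = random.Random(seed)
    return [1 if rng.random() < xi else 0 for xi in x_relaxed]
```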
arXiv Detail & Related papers (2020-09-21T18:00:09Z) - Convergence of adaptive algorithms for weakly convex constrained
optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly smooth optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper sub-problems for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z) - Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms both achieving the optimal bound on excess loss and using $O(\min(n, n^2/d))$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.