COIL: Constrained Optimization in Learned Latent Space -- Learning
Representations for Valid Solutions
- URL: http://arxiv.org/abs/2202.02163v2
- Date: Mon, 7 Feb 2022 11:16:23 GMT
- Title: COIL: Constrained Optimization in Learned Latent Space -- Learning
Representations for Valid Solutions
- Authors: Peter J Bentley, Soo Ling Lim, Adam Gaier and Linh Tran
- Abstract summary: We present Constrained Optimization in Latent Space (COIL), which uses a Variational Autoencoder to learn a latent representation of the valid region of a constrained search space.
Compared to an identical GA using a standard representation, COIL with its learned latent representation can satisfy constraints and find solutions with distance to objective up to two orders of magnitude closer.
- Score: 4.372703857711996
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Constrained optimization problems can be difficult because their search
spaces have properties not conducive to search, e.g., multimodality,
discontinuities, or deception. To address such difficulties, considerable
research has been performed on creating novel evolutionary algorithms or
specialized genetic operators. However, if the representation that defined the
search space could be altered such that it only permitted valid solutions that
satisfied the constraints, the task of finding the optimum would be made more
feasible without any need for specialized optimization algorithms. We propose
the use of a Variational Autoencoder to learn such representations. We present
Constrained Optimization in Latent Space (COIL), which uses a VAE to generate a
learned latent representation from a dataset comprising samples from the valid
region of the search space according to a constraint, thus enabling the
optimizer to find the objective in the new space defined by the learned
representation. We investigate the value of this approach on different
constraint types and for different numbers of variables. We show that, compared
to an identical GA using a standard representation, COIL with its learned
latent representation can satisfy constraints and find solutions with distance
to objective up to two orders of magnitude closer.
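The COIL pipeline described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it substitutes a linear autoencoder (PCA via SVD) for the VAE, uses a simple (mu+lambda) GA, and runs on a toy problem whose valid region is the subspace sum(x) = 0; the problem, dimensions, and hyperparameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy constrained problem: minimise f(x) = ||x - target||^2 over x in R^5,
# subject to the equality constraint sum(x) = 0.
n = 5
target = np.array([1.0, -2.0, 0.5, 0.5, 0.0])  # satisfies sum(x) = 0

# 1. Build a dataset of samples from the valid region (here, by projecting
#    random points onto the hyperplane sum(x) = 0).
samples = rng.normal(size=(1000, n))
samples -= samples.mean(axis=1, keepdims=True)  # enforce sum(x) = 0

# 2. Learn a representation of the valid region. COIL uses a VAE; as a
#    stand-in, a linear autoencoder (PCA) recovers the 4-dim valid subspace.
_, _, vt = np.linalg.svd(samples - samples.mean(axis=0), full_matrices=False)
basis = vt[:4]  # latent dimension = 4

def decode(z):
    """Map a latent vector back to solution space (valid by construction)."""
    return z @ basis

def fitness(z):
    x = decode(z)
    return np.sum((x - target) ** 2)

# 3. Run a plain (20 + 20) GA in the learned latent space: every decoded
#    individual already satisfies the constraint, so no repair is needed.
pop = rng.normal(size=(20, 4))
for _ in range(200):
    children = pop + rng.normal(scale=0.1, size=pop.shape)
    both = np.vstack([pop, children])
    pop = both[np.argsort([fitness(z) for z in both])[:20]]

best = decode(pop[0])
print("constraint violation:", abs(best.sum()))          # near zero
print("distance to objective:", np.sum((best - target) ** 2))
```

Because the learned representation only spans the valid region, the GA never wastes evaluations on infeasible points; this is the mechanism the paper exploits, with the VAE handling valid regions that are not simple linear subspaces.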
Related papers
- On Probabilistic Embeddings in Optimal Dimension Reduction [1.2085509610251701]
Dimension reduction algorithms are a crucial part of many data science pipelines.
Despite their wide utilization, many non-linear dimension reduction algorithms are poorly understood from a theoretical perspective.
arXiv Detail & Related papers (2024-08-05T12:46:21Z) - Factorization of Multi-Agent Sampling-Based Motion Planning [72.42734061131569]
Modern robotics often involves multiple embodied agents operating within a shared environment.
Standard sampling-based algorithms can be used to search for solutions in the robots' joint space.
We integrate the concept of factorization into sampling-based algorithms, which requires only minimal modifications to existing methods.
We present a general implementation of a factorized SBA, derive an analytical gain in terms of sample complexity for PRM*, and showcase empirical results for RRG.
arXiv Detail & Related papers (2023-04-01T15:50:18Z) - On Differential Privacy and Adaptive Data Analysis with Bounded Space [76.10334958368618]
We study the space complexity of the two related fields of differential privacy and adaptive data analysis.
We show that there exists a problem P that requires exponentially more space to be solved efficiently with differential privacy.
The line of work on adaptive data analysis focuses on understanding the number of samples needed for answering a sequence of adaptive queries.
arXiv Detail & Related papers (2023-02-11T14:45:31Z) - Partial Optimality in Cubic Correlation Clustering [6.372499127940625]
Certifying optimality is NP-hard and practically hampered already by the complexity of the problem statement.
Here, we focus on establishing partial optimality conditions for the special case of complete graphs and cubic objective functions.
In addition, we define and implement algorithms for testing these conditions and examine their effect numerically, on two datasets.
arXiv Detail & Related papers (2023-02-09T15:25:52Z) - Symmetric Tensor Networks for Generative Modeling and Constrained
Combinatorial Optimization [72.41480594026815]
Constrained optimization problems abound in industry, from portfolio optimization to logistics.
One of the major roadblocks in solving these problems is the presence of non-trivial hard constraints which limit the valid search space.
In this work, we encode arbitrary integer-valued equality constraints of the form Ax=b directly into U(1)-symmetric tensor networks (TNs) and leverage their applicability as quantum-inspired generative models.
arXiv Detail & Related papers (2022-11-16T18:59:54Z) - Efficient Joint-Dimensional Search with Solution Space Regularization
for Real-Time Semantic Segmentation [27.94898516315886]
We search an optimal network structure that can run in real-time for this problem.
A novel Solution Space Regularization (SSR) loss is first proposed to effectively encourage the supernet to converge to its discrete one.
A new Hierarchical and Progressive Solution Space Shrinking method is presented to further achieve high efficiency of searching.
arXiv Detail & Related papers (2022-08-10T11:07:33Z) - Learning Proximal Operators to Discover Multiple Optima [66.98045013486794]
We present an end-to-end method to learn the proximal operator across a family of problems.
We show that for weakly convex objectives and under mild conditions, the method converges globally.
arXiv Detail & Related papers (2022-01-28T05:53:28Z) - Expressivity of Parameterized and Data-driven Representations in Quality
Diversity Search [111.06379262544911]
We compare the output diversity of a quality diversity evolutionary search performed in two different search spaces.
A learned model is better at interpolating between known data points than at extrapolating or expanding towards unseen examples.
arXiv Detail & Related papers (2021-05-10T10:27:43Z) - BOP-Elites, a Bayesian Optimisation algorithm for Quality-Diversity
search [0.0]
We propose the Bayesian optimisation of Elites (BOP-Elites) algorithm.
By considering user defined regions of the feature space as 'niches' our task is to find the optimal solution in each niche.
The resulting algorithm is very effective in identifying the parts of the search space that belong to a niche in feature space, and finding the optimal solution in each niche.
arXiv Detail & Related papers (2020-05-08T23:49:13Z) - Discovering Representations for Black-box Optimization [73.59962178534361]
We show that black-box optimization encodings can be automatically learned, rather than hand designed.
We show that learned representations make it possible to solve high-dimensional problems with orders of magnitude fewer evaluations than the standard MAP-Elites.
arXiv Detail & Related papers (2020-03-09T20:06:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.