Expressivity of Parameterized and Data-driven Representations in Quality
Diversity Search
- URL: http://arxiv.org/abs/2105.04247v1
- Date: Mon, 10 May 2021 10:27:43 GMT
- Title: Expressivity of Parameterized and Data-driven Representations in Quality
Diversity Search
- Authors: Alexander Hagg, Sebastian Berns, Alexander Asteroth, Simon Colton,
Thomas Bäck
- Abstract summary: We compare the output diversity of a quality diversity evolutionary search performed in two different search spaces.
A learned model is better at interpolating between known data points than at extrapolating or expanding towards unseen examples.
- Score: 111.06379262544911
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider multi-solution optimization and generative models for the
generation of diverse artifacts and the discovery of novel solutions. In cases
where the domain's factors of variation are unknown or too complex to encode
manually, generative models can provide a learned latent space to approximate
these factors. When used as a search space, however, the range and diversity of
possible outputs are limited to the expressivity and generative capabilities of
the learned model. We compare the output diversity of a quality diversity
evolutionary search performed in two different search spaces: 1) a predefined
parameterized space and 2) the latent space of a variational autoencoder model.
We find that the search on an explicit parametric encoding creates more diverse
artifact sets than searching the latent space. A learned model is better at
interpolating between known data points than at extrapolating or expanding
towards unseen examples. We recommend using a generative model's latent space
primarily to measure similarity between artifacts rather than for search and
generation. Whenever a parametric encoding is obtainable, it should be
preferred over a learned representation as it produces a higher diversity of
solutions.
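The two search setups compared above can be sketched with a minimal MAP-Elites-style quality diversity loop in which only the decoding step differs: the parametric genome is the artifact itself, while the latent genome must pass through a frozen decoder. Everything below is an illustrative toy, not the authors' code; the "decoder" is a random tanh layer standing in for a trained VAE.

```python
import numpy as np

rng = np.random.default_rng(0)

def evaluate(artifact):
    """Toy quality measure and 1-D behaviour descriptor (illustrative only)."""
    quality = -np.sum(artifact ** 2)        # higher is better
    descriptor = float(np.mean(artifact))   # where the artifact lands
    return quality, descriptor

def decode_parametric(genome):
    return genome                           # the genome *is* the artifact

def decode_latent(genome, W):
    return np.tanh(W @ genome)              # stand-in for a VAE decoder

def map_elites(decode, dim, bins=20, iters=2000):
    """Minimal MAP-Elites: keep one elite per behaviour-descriptor bin."""
    archive = {}                            # bin index -> (quality, genome)
    for _ in range(iters):
        if archive and rng.random() < 0.9:  # mutate a random elite
            _, parent = archive[rng.choice(list(archive))]
            genome = parent + 0.1 * rng.standard_normal(dim)
        else:                               # or sample a fresh genome
            genome = rng.standard_normal(dim)
        q, d = evaluate(decode(genome))
        b = int(np.clip((d + 1) / 2 * bins, 0, bins - 1))
        if b not in archive or q > archive[b][0]:
            archive[b] = (q, genome)
    return archive

W = rng.standard_normal((8, 2))             # frozen "decoder" weights
filled_param = len(map_elites(decode_parametric, dim=8))
filled_latent = len(map_elites(lambda z: decode_latent(z, W), dim=2))
print(filled_param, filled_latent)
```

Comparing the number of filled archive bins under the two decodings mirrors, in miniature, the paper's diversity comparison between the explicit parametric encoding and the learned latent space.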
Related papers
- GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning [51.677086019209554]
We propose a Generalized Structural Sparse Function to capture powerful relationships across modalities for pair-wise similarity learning.
The distance metric delicately encapsulates two formats of diagonal and block-diagonal terms.
Experiments on cross-modal and two extra uni-modal retrieval tasks have validated its superiority and flexibility.
arXiv Detail & Related papers (2024-10-20T03:45:50Z) - Comparing the latent space of generative models [0.0]
Different encodings of datapoints in the latent space of latent-vector generative models may result in more or less effective and disentangled characterizations of the different explanatory factors of variation behind the data.
A simple linear mapping is enough to pass from a latent space to another while preserving most of the information.
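The claim above, that a simple linear mapping suffices to pass from one latent space to another, can be sanity-checked with a least-squares fit between paired codes. The sketch below is illustrative only: it simulates the two latent spaces with a random rotation plus noise, where in practice the codes would come from two trained encoders.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: two models encode the same 500 datapoints into
# different 16-D latent spaces Z1 and Z2 (simulated here by a random
# orthogonal map plus noise).
Z1 = rng.standard_normal((500, 16))
R = np.linalg.qr(rng.standard_normal((16, 16)))[0]  # random orthogonal map
Z2 = Z1 @ R + 0.01 * rng.standard_normal((500, 16))

# Fit the linear map M minimizing ||Z1 @ M - Z2|| by least squares.
M, *_ = np.linalg.lstsq(Z1, Z2, rcond=None)

# Fraction of Z2's variance explained by the mapped codes.
residual = Z2 - Z1 @ M
r2 = 1 - residual.var() / Z2.var()
print(f"R^2 of linear map between latent spaces: {r2:.3f}")
```

A high R^2 here indicates that most of the information in one latent space is linearly recoverable from the other, the property the summary describes.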
arXiv Detail & Related papers (2022-07-14T10:39:02Z) - COIL: Constrained Optimization in Learned Latent Space -- Learning
Representations for Valid Solutions [4.372703857711996]
We use a Variational Autoencoder to learn representations for Constrained Optimization in Learned Latent Space (COIL).
We show that, compared to an identical GA using a standard representation, COIL with its learned latent representation can satisfy constraints and find solutions up to two orders of magnitude closer to the objective.
arXiv Detail & Related papers (2022-02-04T14:45:37Z) - Massive-scale Decoding for Text Generation using Lattices [34.2658286826597]
We present a search algorithm to construct lattices encoding a massive number of generation options.
We show that our algorithm encodes hundreds to thousands of diverse options that remain grammatical and high-quality into one linear-sized lattice.
arXiv Detail & Related papers (2021-12-14T18:56:11Z) - Multidimensional Assignment Problem for multipartite entity resolution [69.48568967931608]
Multipartite entity resolution aims at integrating records from multiple datasets into one entity.
We apply two procedures, a Greedy algorithm and a large scale neighborhood search, to solve the assignment problem.
We find evidence that design-based multi-start can be more efficient as the size of the databases grows large.
arXiv Detail & Related papers (2021-12-06T20:34:55Z) - Multimodal Data Fusion in High-Dimensional Heterogeneous Datasets via
Generative Models [16.436293069942312]
We are interested in learning probabilistic generative models from high-dimensional heterogeneous data in an unsupervised fashion.
We propose a general framework that combines disparate data types through the exponential family of distributions.
The proposed algorithm is presented in detail for the commonly encountered heterogeneous datasets with real-valued (Gaussian) and categorical (multinomial) features.
arXiv Detail & Related papers (2021-08-27T18:10:31Z) - Redefining Neural Architecture Search of Heterogeneous Multi-Network
Models by Characterizing Variation Operators and Model Components [71.03032589756434]
We investigate the effect of different variation operators in a complex domain, that of multi-network heterogeneous neural models.
We characterize the variation operators according to their effect on the complexity and performance of the model, and the models according to diverse metrics that estimate the quality of their component parts.
arXiv Detail & Related papers (2021-06-16T17:12:26Z) - Conditional Generative Modeling via Learning the Latent Space [54.620761775441046]
We propose a novel framework for conditional generation in multimodal spaces.
It uses latent variables to model generalizable learning patterns.
At inference, the latent variables are optimized to find optimal solutions corresponding to multiple output modes.
arXiv Detail & Related papers (2020-10-07T03:11:34Z) - The data-driven physical-based equations discovery using evolutionary
approach [77.34726150561087]
We describe an algorithm for discovering mathematical equations from observational data.
The algorithm combines genetic programming with sparse regression.
It can be used for the discovery of governing analytical equations as well as partial differential equations (PDEs).
arXiv Detail & Related papers (2020-04-03T17:21:57Z)
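The sparse-regression half of the equation-discovery approach above can be illustrated with a sequentially thresholded least-squares fit (SINDy-style) over a library of candidate terms, of the kind a genetic programming step might propose. This is a generic sketch, not the paper's algorithm; all names and the toy data are illustrative.

```python
import numpy as np

def sparse_regression(Theta, dy, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: repeatedly zero out
    library terms whose coefficients fall below `threshold`."""
    xi = np.linalg.lstsq(Theta, dy, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(Theta[:, big], dy, rcond=None)[0]
    return xi

# Toy data: dy/dt = 2*y generated from y(t) = exp(2t)
t = np.linspace(0, 1, 200)
y = np.exp(2 * t)
dy = 2 * y

# Candidate term library: [1, y, y**2]
Theta = np.column_stack([np.ones_like(y), y, y ** 2])
xi = sparse_regression(Theta, dy)
print(xi)   # only the `y` term should survive, with coefficient ~2
```

The surviving nonzero coefficients name the terms of the recovered equation, here dy/dt = 2y; a genetic programming outer loop would evolve the term library itself.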
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.