Towards Reformulating Essence Specifications for Robustness
- URL: http://arxiv.org/abs/2111.00821v1
- Date: Mon, 1 Nov 2021 10:51:47 GMT
- Title: Towards Reformulating Essence Specifications for Robustness
- Authors: \"Ozg\"ur Akg\"un, Alan M. Frisch, Ian P. Gent, Christopher Jefferson,
Ian Miguel, Peter Nightingale, Andr\'as Z. Salamon
- Abstract summary: Essence is a rich language in which there are many equivalent ways to specify a given problem.
A user may omit the use of domain attributes or abstract types, resulting in fewer refinement rules being applicable.
This paper addresses the problem of recovering this information automatically to increase the robustness of the output constraint models.
- Score: 6.497578221372429
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The Essence language allows a user to specify a constraint problem at a level
of abstraction above that at which constraint modelling decisions are made.
Essence specifications are refined into constraint models using the Conjure
automated modelling tool, which employs a suite of refinement rules. However,
Essence is a rich language in which there are many equivalent ways to specify a
given problem. A user may therefore omit the use of domain attributes or
abstract types, resulting in fewer refinement rules being applicable and
therefore a reduced set of output models from which to select. This paper
addresses the problem of recovering this information automatically, so that the
quality of the output constraint models is robust to
variation in the input Essence specification. We present reformulation rules
that can change the type of a decision variable or add attributes that shrink
its domain. We demonstrate the efficacy of this approach in terms of the
quantity and quality of models Conjure can produce from the transformed
specification compared with the original.
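To make the kind of reformulation described above concrete, the sketch below uses a toy Python representation of an Essence-like `find` declaration (the `SetDomain`, `FindDecl`, `CardinalityConstraint`, and `add_size_attribute` names are illustrative only, not Conjure's actual AST or API). It shows one attribute-recovery rule in miniature: when a constraint fixes the cardinality of a set decision variable, the rule moves that information into a `size` attribute on the domain, which is the kind of recovered attribute that lets more refinement rules apply.

```python
from dataclasses import dataclass
from typing import Optional

# Toy stand-ins for an Essence-like set domain and 'find' declaration.
# All names here are assumptions for illustration; Conjure's real representation is richer.

@dataclass
class SetDomain:
    element_domain: str            # e.g. "int(1..9)"
    size: Optional[int] = None     # the 'size' attribute, if known

@dataclass
class FindDecl:
    name: str
    domain: SetDomain

@dataclass
class CardinalityConstraint:       # models a constraint of the form |x| = k
    var: str
    value: int

def add_size_attribute(decl: FindDecl, constraints: list) -> list:
    """If some constraint fixes |decl.name| to a constant, record it as a
    'size' attribute on the domain and drop the now-redundant constraint."""
    remaining = []
    for c in constraints:
        if isinstance(c, CardinalityConstraint) and c.var == decl.name and decl.domain.size is None:
            decl.domain.size = c.value   # the attribute shrinks the variable's domain
        else:
            remaining.append(c)
    return remaining

# Example: 'find s : set of int(1..9)' plus a constraint |s| = 3
# becomes, in effect, 'find s : set (size 3) of int(1..9)' with no constraint left.
s = FindDecl("s", SetDomain("int(1..9)"))
cons = add_size_attribute(s, [CardinalityConstraint("s", 3)])
print(s.domain.size, len(cons))  # -> 3 0
```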
Related papers
- Athanor: Local Search over Abstract Constraint Specifications [2.3383199519492455]
We focus on general-purpose local search solvers that accept as input a constraint model.
The Athanor solver we describe herein differs in that it begins from a specification of a problem in the abstract constraint specification language Essence.
arXiv Detail & Related papers (2024-10-08T11:41:38Z) - Automatic Feature Learning for Essence: a Case Study on Car Sequencing [1.006631010704608]
We consider the task of building machine learning models to automatically select the best combination for a problem instance.
A critical part of the learning process is to define instance features, which serve as input to the selection model.
Our contribution is automatic learning of instance features directly from the high-level representation of a problem instance using a language model.
arXiv Detail & Related papers (2024-09-23T16:06:44Z) - Label-Efficient Model Selection for Text Generation [14.61636207880449]
We introduce DiffUse, a method to make an informed decision between candidate text generation models based on preference annotations.
In a series of experiments over hundreds of model pairs, we demonstrate that DiffUse can dramatically reduce the required number of annotations.
arXiv Detail & Related papers (2024-02-12T18:54:02Z) - On Regularization and Inference with Label Constraints [62.60903248392479]
We compare two strategies for encoding label constraints in a machine learning pipeline, regularization with constraints and constrained inference.
For regularization, we show that it narrows the generalization gap by precluding models that are inconsistent with the constraints.
For constrained inference, we show that it reduces the population risk by correcting a model's violation, and hence turns the violation into an advantage.
arXiv Detail & Related papers (2023-07-08T03:39:22Z) - CARE: Coherent Actionable Recourse based on Sound Counterfactual
Explanations [0.0]
This paper introduces CARE, a modular explanation framework that addresses the model- and user-level desiderata.
As a model-agnostic approach, CARE generates multiple, diverse explanations for any black-box model.
arXiv Detail & Related papers (2021-08-18T15:26:59Z) - Sufficiently Accurate Model Learning for Planning [119.80502738709937]
This paper introduces the constrained Sufficiently Accurate model learning approach.
It provides examples of such problems, and presents a theorem on how close some approximate solutions can be.
The approximate solution quality will depend on the function parameterization, loss and constraint function smoothness, and the number of samples in model learning.
arXiv Detail & Related papers (2021-02-11T16:27:31Z) - DirectDebug: Automated Testing and Debugging of Feature Models [55.41644538483948]
Variability models (e.g., feature models) are a common way to represent the variabilities and commonalities of software artifacts.
Complex and often large-scale feature models can become faulty, i.e., do not represent the expected variability properties of the underlying software artifact.
arXiv Detail & Related papers (2021-02-11T11:22:20Z) - Towards Portfolios of Streamlined Constraint Models: A Case Study with
the Balanced Academic Curriculum Problem [1.8466814193413488]
We focus on the automatic addition of streamliner constraints, derived from the types present in an abstract Essence specification of a problem class of interest.
The refinement of streamlined Essence specifications into constraint models gives rise to a large number of modelling choices.
Various forms of racing are utilised to constrain the computational cost of training.
arXiv Detail & Related papers (2020-09-21T19:48:02Z) - Rewriting a Deep Generative Model [56.91974064348137]
We introduce a new problem setting: manipulation of specific rules encoded by a deep generative model.
We propose a formulation in which the desired rule is changed by manipulating a layer of a deep network as a linear associative memory.
We present a user interface to enable users to interactively change the rules of a generative model to achieve desired effects.
arXiv Detail & Related papers (2020-07-30T17:58:16Z) - Interpretable Entity Representations through Large-Scale Typing [61.4277527871572]
We present an approach to creating entity representations that are human readable and achieve high performance out of the box.
Our representations are vectors whose values correspond to posterior probabilities over fine-grained entity types.
We show that it is possible to reduce the size of our type set in a learning-based way for particular domains.
arXiv Detail & Related papers (2020-04-30T23:58:03Z) - Attribute-based Regularization of Latent Spaces for Variational
Auto-Encoders [79.68916470119743]
We present a novel method to structure the latent space of a Variational Auto-Encoder (VAE) to encode different continuous-valued attributes explicitly.
This is accomplished by using an attribute regularization loss which enforces a monotonic relationship between the attribute values and the latent code of the dimension along which the attribute is to be encoded.
arXiv Detail & Related papers (2020-04-11T20:53:13Z)