An Integer Linear Programming Framework for Mining Constraints from Data
- URL: http://arxiv.org/abs/2006.10836v2
- Date: Fri, 11 Jun 2021 04:34:33 GMT
- Authors: Tao Meng and Kai-Wei Chang
- Abstract summary: We present a general framework for mining constraints from data.
In particular, we cast inference in structured output prediction as an integer linear programming (ILP) problem.
We show that our approach can learn to solve 9x9 Sudoku puzzles and minimal spanning tree problems from examples without providing the underlying rules.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Structured output prediction problems (e.g., sequential tagging, hierarchical multi-class classification) often involve constraints over the output label space. These constraints interact with the learned models to filter out infeasible solutions and help build an accountable system. However, although constraints are useful, they are often based on hand-crafted rules. This raises a question: *can we mine constraints and rules from data with a learning algorithm?*

In this paper, we present a general framework for mining constraints from data. In particular, we cast the inference step of structured output prediction as an integer linear programming (ILP) problem. Then, given the coefficients of the objective function and the corresponding solutions, we mine the underlying constraints by estimating outer and inner polytopes of the feasible set. We verify the proposed constraint-mining algorithm on various synthetic and real-world applications and demonstrate that it successfully identifies the feasible set at scale.

In particular, we show that our approach can learn to solve 9x9 Sudoku puzzles and minimal spanning tree problems from examples, without being given the underlying rules. Our algorithm can also be integrated with a neural network model to learn the hierarchical label structure of a multi-label classification task. In addition, we provide a theoretical analysis of the tightness of the polytopes and the reliability of the mined constraints.
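To make the mining idea concrete, below is a minimal, self-contained sketch on binary output vectors: a brute-force ILP oracle is solved for random objectives to collect (coefficient, solution) pairs, and only the candidate linear constraints satisfied by every observed solution are kept, giving an outer estimate of the feasible polytope. The function names and the candidate pool are illustrative assumptions, not the authors' implementation.

```python
import itertools

import numpy as np

def solve_ilp(c, constraints):
    """Brute-force ILP oracle: maximize c @ y over binary y subject to a @ y <= b."""
    best, best_val = None, -np.inf
    for bits in itertools.product([0, 1], repeat=len(c)):
        y = np.array(bits)
        if all(a @ y <= b for a, b in constraints):
            if c @ y > best_val:
                best, best_val = y, c @ y
    return best

def mine_outer_polytope(solutions, candidates):
    """Keep candidate constraints satisfied by every observed solution; their
    intersection is an outer estimate of the feasible polytope."""
    return [(a, b) for a, b in candidates if all(a @ y <= b for y in solutions)]

# Toy feasible set: at most one of three binary labels may be active.
true_constraints = [(np.ones(3), 1)]
rng = np.random.default_rng(0)

# Observe (objective, solution) pairs from the oracle.
solutions = [solve_ilp(rng.normal(size=3), true_constraints) for _ in range(50)]

# Hypothesis pool: all constraints a @ y <= b with a in {0,1}^3 and b in {0,1,2}.
candidates = [(np.array(a), b)
              for a in itertools.product([0, 1], repeat=3) for b in range(3)]
mined = mine_outer_polytope(solutions, candidates)

# The mined polytope should reproduce the oracle's answers on fresh objectives.
for _ in range(10):
    c = rng.normal(size=3)
    assert (solve_ilp(c, mined) == solve_ilp(c, true_constraints)).all()
print(f"{len(mined)} of {len(candidates)} candidate constraints survive")
```

In this toy run the surviving constraints carve out exactly the true feasible set, so the mined ILP reproduces the oracle's solutions on fresh objectives; in general the estimate is only guaranteed to lie between the observed solutions and the true polytope.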
Related papers
- Semantic Objective Functions: A distribution-aware method for adding logical constraints in deep learning (arXiv, 2024-05-03)
Constrained Learning and Knowledge Distillation techniques have shown promising results.
We propose a loss-based method that embeds knowledge, i.e., enforces logical constraints, in a machine learning model.
We evaluate our method on a variety of learning tasks, including classification tasks with logic constraints.
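For flavor, here is a tiny hedged sketch of the general loss-based idea (a generic product t-norm relaxation, not necessarily this paper's exact objective): the model is penalized whenever it assigns high probability to "cat" but low probability to "animal", pushing the rule cat => animal toward satisfaction during training.

```python
# Hedged sketch: enforce the rule "cat => animal" as a differentiable penalty
# (product t-norm relaxation). Labels, initial logits, and the weight 1.0 are
# illustrative assumptions.
import torch

logits = torch.tensor([2.0, -1.0], requires_grad=True)  # [cat, animal]
optimizer = torch.optim.SGD([logits], lr=0.5)

for _ in range(100):
    p_cat, p_animal = torch.sigmoid(logits)
    constraint_loss = p_cat * (1 - p_animal)  # large iff "cat" holds but "animal" doesn't
    loss = 1.0 * constraint_loss              # would be added to the usual task loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(torch.sigmoid(logits))  # p(animal) rises until the rule is (softly) satisfied
```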
- Robust Stochastically-Descending Unrolled Networks (arXiv, 2023-12-25)
Deep unrolling is an emerging learning-to-optimize method that unrolls a truncated iterative algorithm in the layers of a trainable neural network.
We show that convergence guarantees and generalizability of the unrolled networks are still open theoretical problems.
We numerically assess unrolled architectures trained under the proposed constraints in two different applications.
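As a quick illustration of the unrolling idea itself (the paper's stochastic descending constraints are not reproduced here), one can unroll a few gradient-descent steps for least squares into "layers", each with a learnable step size:

```python
# Hedged sketch of deep unrolling: K gradient-descent iterations for least
# squares, x <- x - t_k * A^T (A x - y), treated as K network layers with a
# learnable per-layer step size t_k.
import torch
import torch.nn as nn

class UnrolledGD(nn.Module):
    def __init__(self, num_layers=5):
        super().__init__()
        self.steps = nn.Parameter(0.02 * torch.ones(num_layers))  # one t_k per layer

    def forward(self, A, y):
        x = torch.zeros(A.shape[1])
        for t in self.steps:
            x = x - t * (A.T @ (A @ x - y))  # one unrolled gradient step
        return x

torch.manual_seed(0)
A = torch.randn(8, 4)
x_true = torch.randn(4)
y = A @ x_true
model = UnrolledGD()
print(torch.norm(model(A, y) - x_true).item())  # error after 5 unrolled steps
```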
- A New Computationally Simple Approach for Implementing Neural Networks with Output Hard Constraints (arXiv, 2023-07-19)
A new method of imposing hard convex constraints on the neural network output values is proposed.
The mapping is implemented by an additional neural network layer that enforces the output constraints.
The method extends naturally to the case where constraints are imposed not only on the output vectors but also jointly on outputs and inputs.
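One common instance of such a layer, shown here purely as an illustrative assumption rather than the paper's construction, is a final projection onto a convex feasible set, e.g. Euclidean projection onto the probability simplex:

```python
# Hedged sketch: a "hard constraint layer" realized as Euclidean projection
# onto the probability simplex {x : x >= 0, sum(x) = 1}.
import torch

def project_to_simplex(v):
    """Euclidean projection of v onto the probability simplex (sort-based)."""
    u, _ = torch.sort(v, descending=True)
    css = torch.cumsum(u, dim=0)
    ks = torch.arange(1, len(v) + 1, dtype=v.dtype)
    rho = torch.nonzero(u * ks > css - 1).max() + 1  # number of active coordinates
    tau = (css[rho - 1] - 1) / rho
    return torch.clamp(v - tau, min=0.0)

raw = torch.tensor([2.0, -1.0, 0.5])  # unconstrained network output
out = project_to_simplex(raw)
print(out, out.sum())                 # nonnegative entries summing to 1
```

Because the projection is applied as the last layer, the constraint holds exactly for every input, not just in expectation or after convergence.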
- Neural Fields with Hard Constraints of Arbitrary Differential Order (arXiv, 2023-06-15)
We develop a series of approaches for enforcing hard constraints on neural fields.
The constraints can be specified as a linear operator applied to the neural field and its derivatives.
Our approaches are demonstrated in a wide range of real-world applications.
- Provable Reinforcement Learning with a Short-Term Memory (arXiv, 2022-02-08)
We study a new subclass of POMDPs, whose latent states can be decoded by the most recent history of a short length $m$.
In particular, in the rich-observation setting, we develop new algorithms using a novel "moment matching" approach with a sample complexity that scales exponentially in the short length $m$ (rather than in the full horizon).
Our results show that a short-term memory suffices for reinforcement learning in these environments.
- Constrained Machine Learning: The Bagel Framework (arXiv, 2021-12-02)
Constrained machine learning problems require learned models to be both accurate and compliant with constraints.
The goal of this paper is to broaden the modeling capacity of constrained machine learning problems by incorporating existing work from optimization.
Because machine learning has specific requirements, we also propose an extended table constraint to split the space of hypotheses.
- Adaptive Discretization in Online Reinforcement Learning (arXiv, 2021-10-29)
Two major questions in designing discretization-based algorithms are how to create the discretization and when to refine it.
We provide a unified theoretical analysis of tree-based hierarchical partitioning methods for online reinforcement learning.
Our algorithms are easily adapted to operating constraints, and our theory provides explicit bounds across each of the three facets.
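A bare-bones sketch of the tree-based refinement pattern follows; the splitting rule below is generic (split a cell once its visit count exceeds a fixed budget), and the paper's exact condition and bounds are not reproduced.

```python
# Illustrative adaptive discretization: a binary tree over [0, 1] where
# frequently-visited cells are split, so hot regions get a finer partition.
import random

class Cell:
    def __init__(self, lo, hi):
        self.lo, self.hi, self.visits, self.children = lo, hi, 0, None

    def locate(self, x):
        if self.children:
            return self.children[x >= self.children[1].lo].locate(x)
        return self

    def visit(self, x, budget=8):
        cell = self.locate(x)
        cell.visits += 1
        if cell.visits >= budget and cell.hi - cell.lo > 1e-3:  # refine hot cells
            mid = (cell.lo + cell.hi) / 2
            cell.children = (Cell(cell.lo, mid), Cell(mid, cell.hi))

def leaves(c):
    return [c] if not c.children else leaves(c.children[0]) + leaves(c.children[1])

random.seed(0)
root = Cell(0.0, 1.0)
for _ in range(500):
    root.visit(random.betavariate(8, 2))  # samples concentrated near 1.0

# Leaves near 1.0 end up much narrower than leaves near 0.0.
print([(round(c.lo, 3), round(c.hi, 3)) for c in leaves(root)])
```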
- CoreDiag: Eliminating Redundancy in Constraint Sets (arXiv, 2021-02-24)
We present a new algorithm for determining minimal cores (minimal non-redundant constraint sets).
The algorithm is especially useful for distributed knowledge engineering scenarios where the degree of redundancy can become high.
In order to show the applicability of our approach, we present an empirical study conducted with commercial configuration knowledge bases.
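For intuition, here is a generic redundancy-elimination sweep over a finite domain (not the CoreDiag algorithm itself); note that the resulting core can depend on the removal order.

```python
# Generic redundancy elimination: drop each constraint whose removal leaves
# the feasible set unchanged. The toy domain and constraints are illustrative.
from itertools import product

domain = list(product(range(3), repeat=2))  # all (x, y) with x, y in {0, 1, 2}

constraints = {
    "c1": lambda x, y: x + y <= 3,  # redundant: implied by c3
    "c2": lambda x, y: x <= 2,      # redundant: always true on this domain
    "c3": lambda x, y: x + y <= 2,
}

def feasible(cons):
    return {p for p in domain if all(c(*p) for c in cons.values())}

core = dict(constraints)
for name in list(core):
    trial = {k: v for k, v in core.items() if k != name}
    if feasible(trial) == feasible(core):  # removing it changes nothing
        core = trial

print(sorted(core))  # only the non-redundant constraint remains: ['c3']
```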
- Local Propagation in Constraint-based Neural Network (arXiv, 2020-02-18)
We study a constraint-based representation of neural network architectures.
We investigate a simple optimization procedure that is well suited to fulfil the so-called architectural constraints.