Deep Neural Network for Constraint Acquisition through Tailored Loss Function
- URL: http://arxiv.org/abs/2403.02042v1
- Date: Mon, 4 Mar 2024 13:47:33 GMT
- Title: Deep Neural Network for Constraint Acquisition through Tailored Loss Function
- Authors: Eduardo Vyhmeister, Rocio Paez, Gabriel Gonzalez
- Abstract summary: The significance of learning constraints from data is underscored by its potential applications in real-world problem-solving.
This work introduces a novel approach grounded in a Deep Neural Network (DNN) based on Symbolic Regression.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The significance of learning constraints from data is underscored by its
potential applications in real-world problem-solving. While constraints are
popular for modeling and solving, approaches to learning constraints from
data remain relatively scarce. Moreover, modeling is an intricate task that
demands expertise and is prone to errors; constraint acquisition methods
address this by automating the process, learning constraints from examples
or from the behaviour of solutions and non-solutions. This work introduces a
novel approach, grounded in a Deep Neural Network (DNN) based on Symbolic
Regression, in which suitably tailored loss functions allow constraints to be
extracted directly from datasets. With the present approach, direct
formulation of constraints was achieved. Furthermore, given the broad range of
pre-developed DNN architectures and functionality, connections to and
extensions of other frameworks can be foreseen.
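For intuition, here is a minimal, hypothetical sketch of the core idea: a small DNN g_theta is trained with a hinge-style loss that pushes g_theta(x) below zero on solutions and above zero on non-solutions, so the trained network can be read as an acquired constraint g_theta(x) <= 0. The architecture, margin, and loss below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): learn a constraint g(x) <= 0
# from labelled solutions (y = 1) and non-solutions (y = 0) via a tailored,
# hinge-style loss.
import torch
import torch.nn as nn

class ConstraintNet(nn.Module):
    def __init__(self, n_inputs: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def tailored_loss(g, y, margin=0.1):
    # Solutions are pushed to g(x) <= -margin, non-solutions to g(x) >= margin.
    return (torch.relu(g + margin) * y + torch.relu(margin - g) * (1 - y)).mean()

# Toy data: "solutions" satisfy the hidden constraint x1^2 + x2^2 <= 1.
torch.manual_seed(0)
x = torch.rand(2048, 2) * 4 - 2
y = ((x ** 2).sum(dim=1) <= 1.0).float()

model = ConstraintNet(n_inputs=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    tailored_loss(model(x), y).backward()
    opt.step()
```

In the paper's setting the network mirrors a symbolic-regression structure, so the learnt g would additionally be readable as an explicit formula; the plain MLP above skips that part.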
Related papers
- Neural Fields with Hard Constraints of Arbitrary Differential Order
We develop a series of approaches for enforcing hard constraints on neural fields.
The constraints can be specified as a linear operator applied to the neural field and its derivatives.
Our approaches are demonstrated in a wide range of real-world applications.
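As a loose, zeroth-order illustration of hard constraints by construction (the paper's machinery covers arbitrary linear differential operators, which this sketch does not attempt; all names here are hypothetical):

```python
# Hypothetical sketch: a neural field reparameterized so f(x0) = c holds
# exactly for any weights; a zeroth-order special case of hard constraints.
import torch
import torch.nn as nn

class HardConstrainedField(nn.Module):
    def __init__(self, x0: float, c: float):
        super().__init__()
        self.x0, self.c = x0, c
        self.net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))

    def forward(self, x):
        # The factor (x - x0) vanishes at x0, so f(x0) = c by construction.
        return self.c + (x - self.x0) * self.net(x)

field = HardConstrainedField(x0=0.5, c=2.0)
print(field(torch.tensor([[0.5]])))  # exactly 2.0, no training needed
```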
arXiv Detail & Related papers (2023-06-15T08:33:52Z)
- Constrained Empirical Risk Minimization: Theory and Practice
We present a framework that allows the exact enforcement of constraints on parameterized sets of functions such as Deep Neural Networks (DNNs).
We focus on constraints that are outside the scope of equivariant networks used in Geometric Deep Learning.
As a major example of the framework, we restrict filters of a Convolutional Neural Network (CNN) to be wavelets, and apply these wavelet networks to the task of contour prediction in the medical domain.
arXiv Detail & Related papers (2023-02-09T16:11:58Z)
- On data-driven chance constraint learning for mixed-integer optimization problems
We develop a Chance Constraint Learning (CCL) methodology with a focus on mixed-integer linear optimization problems.
CCL makes use of linearizable machine learning models to estimate conditional quantiles of the learned variables.
Open-access software has been developed for use by practitioners.
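For a flavour of the quantile step (an illustrative stand-in, not the paper's open-access software; the data and budget constraint are made up): a linear 0.95-quantile model remains linear in x, so the learnt bound can enter a mixed-integer model as an ordinary linear constraint.

```python
# Illustrative sketch: approximate the chance constraint P(y <= b | x) >= 0.95
# by a linear model of the 0.95 conditional quantile of y given x.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=(500, 1))
y = 2.0 * x[:, 0] + rng.normal(0.0, 1.0 + 0.3 * x[:, 0])  # heteroscedastic

q95 = QuantileRegressor(quantile=0.95, alpha=0.0).fit(x, y)
w, b0 = q95.coef_[0], q95.intercept_
# "w * x + b0 <= budget" is now a linear surrogate for the chance constraint.
print(f"q_0.95(y | x) ~ {w:.2f} * x + {b0:.2f}")
```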
arXiv Detail & Related papers (2022-07-08T11:54:39Z)
- Constraint Guided Gradient Descent: Guided Training with Inequality Constraints
Constraint Guided Gradient Descent (CGGD) is proposed, enabling the injection of domain knowledge into the training procedure.
CGGD converges to a model that satisfies any inequality constraint on the training data.
It is empirically shown on two independent and small data sets that CGGD makes training less dependent on the initialisation of the network.
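As a rough stand-in for constraint injection during training (CGGD itself rewrites gradient directions; the soft penalty below is a simpler, hypothetical proxy):

```python
# Hypothetical penalty-style proxy: keep predictions non-negative while
# fitting data, i.e. inject the inequality constraint f(x) >= 0 into training.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x = torch.linspace(0.0, 1.0, 128).unsqueeze(1)
y = x ** 2

for _ in range(1000):
    pred = model(x)
    fit = nn.functional.mse_loss(pred, y)
    violation = torch.relu(-pred).mean()  # zero exactly when f(x) >= 0 holds
    loss = fit + 10.0 * violation
    opt.zero_grad()
    loss.backward()
    opt.step()
```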
arXiv Detail & Related papers (2022-06-13T14:33:33Z)
- Sufficiently Accurate Model Learning for Planning
This paper introduces the constrained Sufficiently Accurate model learning approach.
It provides examples of such constrained problems and presents a theorem bounding the quality of approximate solutions.
The approximate solution quality will depend on the function parameterization, loss and constraint function smoothness, and the number of samples in model learning.
arXiv Detail & Related papers (2021-02-11T16:27:31Z)
- Developing Constrained Neural Units Over Time
This paper focuses on an alternative way of defining Neural Networks that differs from the majority of existing approaches.
The structure of the neural architecture is defined by a special class of constraints, which also extend to the network's interaction with data.
The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner.
arXiv Detail & Related papers (2020-09-01T09:07:25Z)
- Differentiable Causal Discovery from Interventional Data
We propose a theoretically-grounded method based on neural networks that can leverage interventional data.
We show that our approach compares favorably to the state of the art in a variety of settings.
arXiv Detail & Related papers (2020-07-03T15:19:17Z)
- An Integer Linear Programming Framework for Mining Constraints from Data
We present a general framework for mining constraints from data.
In particular, we consider the inference in structured output prediction as an integer linear programming (ILP) problem.
We show that our approach can learn to solve 9x9 Sudoku puzzles and minimal spanning tree problems from examples without providing the underlying rules.
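A toy rendering of the mining idea (not the paper's ILP formulation; the candidate families below are invented for illustration): candidate constraints survive only if every observed solution satisfies them.

```python
# Toy constraint mining: keep candidate constraints that no observed
# solution violates. Candidate families here are illustrative only.
import itertools
import numpy as np

solutions = np.array([[1, 2, 3], [2, 3, 1], [3, 1, 2]])  # rows of a Latin square
n = solutions.shape[1]

mined = []
for i, j in itertools.combinations(range(n), 2):
    if np.all(solutions[:, i] != solutions[:, j]):
        mined.append(f"x{i} != x{j}")
    if np.all(solutions[:, i] + solutions[:, j] <= 5):
        mined.append(f"x{i} + x{j} <= 5")

print(mined)  # constraints consistent with every example
```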
arXiv Detail & Related papers (2020-06-18T20:09:53Z)
- Local Propagation in Constraint-based Neural Network
We study a constraint-based representation of neural network architectures.
We investigate a simple optimization procedure that is well suited to fulfil the so-called architectural constraints.
arXiv Detail & Related papers (2020-02-18T16:47:38Z)