Multi-Label Classification Neural Networks with Hard Logical Constraints
- URL: http://arxiv.org/abs/2103.13427v1
- Date: Wed, 24 Mar 2021 18:13:56 GMT
- Title: Multi-Label Classification Neural Networks with Hard Logical Constraints
- Authors: Eleonora Giunchiglia and Thomas Lukasiewicz
- Abstract summary: We propose a novel approach for solving hierarchical multi-label classification (HMC) problems.
C-HMCNN(h) exploits hierarchy information in order to produce predictions coherent with the constraints and to improve performance.
We also propose a new model CCN(h) which extends C-HMCNN(h) and is again able to satisfy and exploit the constraints to improve performance.
- Score: 45.99924614659817
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-label classification (MC) is a standard machine learning problem in
which a data point can be associated with a set of classes. A more challenging
scenario is given by hierarchical multi-label classification (HMC) problems, in
which every prediction must satisfy a given set of hard constraints expressing
subclass relationships between classes. In this paper, we propose C-HMCNN(h), a
novel approach for solving HMC problems, which, given a network h for the
underlying MC problem, exploits the hierarchy information in order to produce
predictions coherent with the constraints and to improve performance.
Furthermore, we extend the logic used to express HMC constraints in order to be
able to specify more complex relations among the classes and propose a new
model CCN(h), which extends C-HMCNN(h) and is again able to satisfy and exploit
the constraints to improve performance. We conduct an extensive experimental
analysis showing the superior performance of both C-HMCNN(h) and CCN(h) when
compared to state-of-the-art models in both the HMC and the general MC setting
with hard logical constraints.
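The coherence requirement described above (a subclass prediction must never exceed that of its ancestors) can be illustrated with a minimal NumPy sketch. The max-over-descendants rule below follows the general idea of producing hierarchy-coherent outputs; the function name, hierarchy encoding, and toy classes are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def coherent_predictions(probs, descendants):
    """Make raw multi-label probabilities coherent with a class hierarchy
    by setting each class's score to the max over itself and its
    descendants, so p(ancestor) >= p(descendant) always holds."""
    coherent = np.empty_like(probs)
    for c, desc in enumerate(descendants):
        # desc lists all classes strictly below class c in the hierarchy
        coherent[c] = probs[[c] + desc].max()
    return coherent

# Toy hierarchy: 0 = animal, 1 = dog (subclass of 0), 2 = cat (subclass of 0)
descendants = [[1, 2], [], []]
raw = np.array([0.3, 0.9, 0.1])   # raw network outputs violate the hierarchy
out = coherent_predictions(raw, descendants)
print(out)  # the animal score rises to 0.9, matching its most confident subclass
```

Note that this post-processing alone only repairs incoherent outputs; the paper's contribution also lies in exploiting the constraints during training to improve performance.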
Related papers
- HMIL: Hierarchical Multi-Instance Learning for Fine-Grained Whole Slide Image Classification [10.203984731917851]
Fine-grained classification of whole slide images (WSIs) is essential in precision oncology, enabling precise cancer diagnosis and personalized treatment strategies.
While the multi-instance learning (MIL) paradigm alleviates the computational burden of WSIs, existing MIL methods often overlook hierarchical label correlations.
We introduce a novel hierarchical multi-instance learning (HMIL) framework to overcome these limitations.
arXiv Detail & Related papers (2024-11-12T09:22:00Z)
- Coding for Intelligence from the Perspective of Category [66.14012258680992]

Coding targets compressing and reconstructing data; recent trends demonstrate the potential homogeneity of coding and intelligence.
We propose a novel problem of Coding for Intelligence from the category theory view.
arXiv Detail & Related papers (2024-07-01T07:05:44Z)
- Solving Satisfiability Modulo Counting for Symbolic and Statistical AI Integration With Provable Guarantees [18.7083987727973]
Satisfiability Modulo Counting (SMC) encompasses problems that require both symbolic decision-making and statistical reasoning.
XOR-SMC transforms the highly intractable SMC problem into satisfiability problems by replacing the model counting in SMC with SAT formulae.
XOR-SMC finds solutions close to the true optimum, outperforming several baselines which struggle to find good approximations for the intractable model counting in SMC.
arXiv Detail & Related papers (2023-09-16T05:34:59Z)
- Balanced Classification: A Unified Framework for Long-Tailed Object Detection [74.94216414011326]
Conventional detectors suffer from performance degradation when dealing with long-tailed data due to a classification bias towards the majority head categories.
We introduce a unified framework called BAlanced CLassification (BACL), which enables adaptive rectification of inequalities caused by disparities in category distribution.
BACL consistently achieves performance improvements across various datasets with different backbones and architectures.
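The abstract does not describe BACL's actual mechanism, but the head-category bias it targets can be illustrated with a standard, simpler technique: logit adjustment by class priors. The counts and names below are made up for illustration; this is not BACL.

```python
import numpy as np

# Class frequencies from a hypothetical long-tailed training set:
# one dominant "head" class and two rare "tail" classes.
class_counts = np.array([9000, 700, 300])
priors = class_counts / class_counts.sum()

def adjust_logits(logits, priors, tau=1.0):
    """Subtract tau * log(prior) from each class logit so that rare
    classes are no longer suppressed by the head-class bias."""
    return logits - tau * np.log(priors)

# A biased classifier output that narrowly favors the head class.
logits = np.array([2.0, 1.9, 1.8])
print(np.argmax(logits))                         # head class wins: 0
print(np.argmax(adjust_logits(logits, priors)))  # a tail class wins: 2
```

The adjustment rewards confident predictions on rare classes, one simple way of "rectifying inequalities caused by disparities in category distribution".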
arXiv Detail & Related papers (2023-08-04T09:11:07Z)
- On Leave-One-Out Conditional Mutual Information For Generalization [122.2734338600665]
We derive information-theoretic generalization bounds for supervised learning algorithms based on a new measure of leave-one-out conditional mutual information (loo-CMI).
Contrary to other CMI bounds, our loo-CMI bounds can be computed easily and can be interpreted in connection to other notions such as classical leave-one-out cross-validation.
We empirically validate the quality of the bound by evaluating its predicted generalization gap in scenarios for deep learning.
arXiv Detail & Related papers (2022-07-01T17:58:29Z)
- Exact Learning of Qualitative Constraint Networks from Membership Queries [2.9005223064604078]
A qualitative constraint network (QCN) is a constraint graph for representing problems under qualitative temporal and spatial relations.
We propose a new algorithm for learning, through membership queries, a QCN from a non-expert.
The goal here is to reduce the number of membership queries needed to reach the target QCN.
arXiv Detail & Related papers (2021-09-23T22:25:37Z)
- Continual Competitive Memory: A Neural System for Online Task-Free Lifelong Learning [91.3755431537592]
We propose a novel form of unsupervised learning, continual competitive memory (CCM).
The resulting neural system is shown to offer an effective approach for combating catastrophic forgetting in online continual classification problems.
We demonstrate that the proposed CCM system not only outperforms other competitive learning neural models but also yields performance that is competitive with several modern, state-of-the-art lifelong learning approaches.
arXiv Detail & Related papers (2021-06-24T20:12:17Z)
- Coherent Hierarchical Multi-Label Classification Networks [56.41950277906307]
C-HMCNN(h) is a novel approach for HMC problems, which exploits hierarchy information in order to produce predictions coherent with the constraint and improve performance.
We conduct an extensive experimental analysis showing the superior performance of C-HMCNN(h) when compared to state-of-the-art models.
arXiv Detail & Related papers (2020-10-20T09:37:02Z)
- Investigating Class-level Difficulty Factors in Multi-label Classification Problems [23.51529285126783]
This work investigates the use of class-level difficulty factors in multi-label classification problems for the first time.
Four difficulty factors are proposed: frequency, visual variation, semantic abstraction, and class co-occurrence.
These difficulty factors are shown to have several potential applications including the prediction of class-level performance across datasets.
arXiv Detail & Related papers (2020-05-01T15:06:53Z)
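Two of the four difficulty factors above, frequency and class co-occurrence, can be computed directly from a binary label matrix. The sketch below shows one plausible formulation under that assumption; the variable names and toy data are mine, not the paper's.

```python
import numpy as np

# Binary label matrix: rows = samples, columns = classes.
# Toy data: 4 samples, 3 classes.
Y = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 1, 0]])

# Frequency factor: fraction of samples carrying each class label.
frequency = Y.mean(axis=0)

# Co-occurrence factor: fraction of samples in which each pair of
# classes appears together (diagonal reduces to the frequency).
cooccurrence = (Y.T @ Y) / len(Y)

print(frequency)      # per-class label frequency
print(cooccurrence)   # symmetric class co-occurrence matrix
```

Visual variation and semantic abstraction, by contrast, depend on image content and label semantics, so they cannot be read off the label matrix alone.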
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.