Neural Network-Based Rule Models With Truth Tables
- URL: http://arxiv.org/abs/2309.09638v1
- Date: Mon, 18 Sep 2023 10:13:59 GMT
- Title: Neural Network-Based Rule Models With Truth Tables
- Authors: Adrien Benamira, Tristan Guérand, Thomas Peyrin, Hans Soegeng
- Abstract summary: We introduce a neural network framework that combines the global and exact interpretability properties of rule-based models with the high performance of deep neural networks.
Our proposed framework, called $\textit{Truth Table rules}$ (TT-rules), is built upon $\textit{Truth Table nets}$ (TTnets)
- Score: 5.187307904567701
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding the decision-making process of a machine/deep learning model is
crucial, particularly in security-sensitive applications. In this study, we
introduce a neural network framework that combines the global and exact
interpretability properties of rule-based models with the high performance of
deep neural networks.
Our proposed framework, called $\textit{Truth Table rules}$ (TT-rules), is
built upon $\textit{Truth Table nets}$ (TTnets), a family of deep neural
networks initially developed for formal verification. By extracting the set of
necessary and sufficient rules $\mathcal{R}$ from the trained TTnet model
(global interpretability), yielding the same output as the TTnet (exact
interpretability), TT-rules effectively transforms the neural network into a
rule-based model. This rule-based model supports binary classification,
multi-label classification, and regression tasks for tabular datasets.
Furthermore, our TT-rules framework optimizes the rule set $\mathcal{R}$ into
$\mathcal{R}_{opt}$ by reducing the number and size of the rules. To enhance
model interpretation, we leverage Reduced Ordered Binary Decision Diagrams
(ROBDDs) to visualize these rules effectively.
After outlining the framework, we evaluate the performance of TT-rules on
seven tabular datasets from finance, healthcare, and justice domains. We also
compare the TT-rules framework to state-of-the-art rule-based methods. Our
results demonstrate that TT-rules achieves equal or higher performance compared
to other interpretable methods while maintaining a balance between performance
and complexity. Notably, TT-rules presents the first accurate rule-based model
capable of fitting large tabular datasets, including two real-life DNA datasets
with over 20K features. Finally, we extensively investigate a rule-based model
derived from TT-rules using the Adult dataset.
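The core extraction step described above, turning each small learned block into necessary and sufficient rules, can be sketched in a few lines. The helper below is a hypothetical illustration, not the authors' implementation: it enumerates a binary block's truth table and keeps each positive row as one conjunction of a DNF formula (the `extract_rules` name and the XOR toy block are assumptions for this sketch).

```python
from itertools import product

def extract_rules(block, n_inputs):
    """Enumerate the truth table of a small binary block and return
    its positive rows as conjunctions of literals (a DNF rule set)."""
    rules = []
    for bits in product([0, 1], repeat=n_inputs):
        if block(bits) == 1:
            # each positive truth-table row becomes one AND of literals
            rules.append(tuple(f"x{i}" if b else f"not x{i}"
                               for i, b in enumerate(bits)))
    return rules  # the rule set: OR over these conjunctions

# toy "learned" block: fires iff exactly one of its two inputs is 1 (XOR)
xor_block = lambda bits: bits[0] ^ bits[1]
print(extract_rules(xor_block, 2))
# → [('not x0', 'x1'), ('x0', 'not x1')]
```

Because the rules are read directly off the truth table, they reproduce the block's output exactly, which is the sense in which the extracted rule-based model is an exact, globally interpretable replacement for the network.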
Related papers
- Best of Both Worlds: A Pliable and Generalizable Neuro-Symbolic Approach for Relation Classification [17.398872494876365]
This paper introduces a novel neuro-symbolic architecture for relation classification (RC)
It combines rule-based methods with contemporary deep learning techniques.
We show that our proposed method outperforms previous state-of-the-art models in three out of four settings.
arXiv Detail & Related papers (2024-03-05T20:08:32Z)
- Learning Interpretable Rules for Scalable Data Representation and Classification [11.393431987232425]
RRL (Rule-based Representation Learner) learns interpretable non-fuzzy rules for data representation and classification.
RRL can be easily adjusted to obtain a trade-off between classification accuracy and model complexity for different scenarios.
arXiv Detail & Related papers (2023-10-22T15:55:58Z)
- A New Interpretable Neural Network-Based Rule Model for Healthcare Decision Making [5.666761232081187]
In this study, we introduce a neural network framework, $\textit{Truth Table rules}$ (TT-rules), that combines the global and exact interpretability properties of rule-based models with the high performance of deep neural networks.
TT-rules is built upon $\textit{Truth Table nets}$ (TTnets), a family of deep neural networks initially developed for formal verification.
arXiv Detail & Related papers (2023-09-20T07:15:48Z)
- CGXplain: Rule-Based Deep Neural Network Explanations Using Dual Linear Programs [4.632241550169363]
Rule-based surrogate models are an effective way to approximate a Deep Neural Network's (DNN) decision boundaries.
This paper introduces the CGX (Column Generation eXplainer) to address these limitations.
arXiv Detail & Related papers (2023-04-11T13:16:26Z)
- Neuro-symbolic Rule Learning in Real-world Classification Tasks [75.0907310059298]
We extend pix2rule's neural DNF module to support rule learning in real-world multi-class and multi-label classification tasks.
We propose a novel extended model called neural DNF-EO (Exactly One) which enforces mutual exclusivity in multi-class classification.
arXiv Detail & Related papers (2023-03-29T13:27:14Z)
- Robust Training and Verification of Implicit Neural Networks: A Non-Euclidean Contractive Approach [64.23331120621118]
This paper proposes a theoretical and computational framework for training and robustness verification of implicit neural networks.
We introduce a related embedded network and show that the embedded network can be used to provide an $\ell_\infty$-norm box over-approximation of the reachable sets of the original network.
We apply our algorithms to train implicit neural networks on the MNIST dataset and compare the robustness of our models with the models trained via existing approaches in the literature.
arXiv Detail & Related papers (2022-08-08T03:13:24Z)
- Exploiting Low-Rank Tensor-Train Deep Neural Networks Based on Riemannian Gradient Descent With Illustrations of Speech Processing [74.31472195046099]
We exploit a low-rank tensor-train deep neural network (TT-DNN) to build an end-to-end deep learning pipeline, namely LR-TT-DNN.
A hybrid model combining LR-TT-DNN with a convolutional neural network (CNN) is set up to boost the performance.
Our empirical evidence demonstrates that the LR-TT-DNN and CNN+(LR-TT-DNN) models, despite having fewer parameters, can outperform their TT-DNN and CNN+(TT-DNN) counterparts.
arXiv Detail & Related papers (2022-03-11T15:55:34Z)
- Robustness Certificates for Implicit Neural Networks: A Mixed Monotone Contractive Approach [60.67748036747221]
Implicit neural networks offer competitive performance and reduced memory consumption.
However, they can remain brittle with respect to adversarial input perturbations.
This paper proposes a theoretical and computational framework for robustness verification of implicit neural networks.
arXiv Detail & Related papers (2021-12-10T03:08:55Z)
- Sequence Transduction with Graph-based Supervision [96.04967815520193]
We present a new transducer objective function that generalizes the RNN-T loss to accept a graph representation of the labels.
We demonstrate that transducer-based ASR with CTC-like lattice achieves better results compared to standard RNN-T.
arXiv Detail & Related papers (2021-11-01T21:51:42Z)
- Integrating Regular Expressions with Neural Networks via DFA [40.09868407372605]
Integrating rule knowledge into neural networks yields hybrid models that can achieve better performance.
Specifically, the human-designed rules are formulated as Regular Expressions (REs)
We propose to use the MDFA as an intermediate model to capture the matched RE patterns as rule-based features for each input sentence.
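The general idea of exposing regular-expression matches as rule-based features for each input can be sketched as follows. Note this is a hypothetical simplification: the paper uses an MDFA as the intermediate model, whereas this sketch lets Python's `re` engine do the matching directly, and the `RULES` patterns and `re_features` helper are illustrative assumptions.

```python
import re

# Hypothetical human-designed rules, formulated as regular expressions.
RULES = {
    "has_date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
    "has_email": re.compile(r"\b\w+@\w+\.\w+\b"),
}

def re_features(sentence):
    """Return one 0/1 feature per rule, marking which RE patterns
    matched the sentence (analogous to matched-pattern features)."""
    return [int(bool(p.search(sentence))) for p in RULES.values()]

print(re_features("Contact alice@example.com by 2021-09-07."))
# → [1, 1]
```

A downstream neural model could then consume this feature vector alongside its learned representations, which is the hybrid-model idea the summary describes.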
arXiv Detail & Related papers (2021-09-07T05:48:51Z)
- Learning Reasoning Strategies in End-to-End Differentiable Proving [50.9791149533921]
Conditional Theorem Provers learn optimal rule selection strategy via gradient-based optimisation.
We show that Conditional Theorem Provers are scalable and yield state-of-the-art results on the CLUTRR dataset.
arXiv Detail & Related papers (2020-07-13T16:22:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.