Fuzzy Logic Visual Network (FLVN): A neuro-symbolic approach for visual
features matching
- URL: http://arxiv.org/abs/2307.16019v1
- Date: Sat, 29 Jul 2023 16:21:40 GMT
- Title: Fuzzy Logic Visual Network (FLVN): A neuro-symbolic approach for visual
features matching
- Authors: Francesco Manigrasso and Lia Morra and Fabrizio Lamberti
- Abstract summary: We present the Fuzzy Logic Visual Network (FLVN) that formulates the task of learning a visual-semantic embedding space within a neuro-symbolic LTN framework.
FLVN incorporates prior knowledge in the form of class hierarchies (classes and macro-classes) along with robust high-level inductive biases.
It achieves competitive performance to recent ZSL methods with less computational overhead.
- Score: 6.128849673587451
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuro-symbolic integration aims at harnessing the power of symbolic knowledge
representation combined with the learning capabilities of deep neural networks.
In particular, Logic Tensor Networks (LTNs) make it possible to incorporate
background knowledge in the form of logical axioms by grounding a first-order
logic language as differentiable operations over real tensors. Yet, few studies
have investigated the potential benefits of this approach to improve zero-shot
learning (ZSL) classification. In this study, we present the Fuzzy Logic Visual
Network (FLVN) that formulates the task of learning a visual-semantic embedding
space within a neuro-symbolic LTN framework. FLVN incorporates prior knowledge
in the form of class hierarchies (classes and macro-classes) along with robust
high-level inductive biases. The latter make it possible, for instance, to handle
exceptions in class-level attributes and to enforce similarity between images
of the same class, preventing premature overfitting to seen classes and
improving overall performance. FLVN reaches state-of-the-art performance on the
Generalized ZSL (GZSL) benchmarks AWA2 and CUB, improving by 1.3% and 3%,
respectively. Overall, it achieves competitive performance to recent ZSL
methods with less computational overhead. FLVN is available at
https://gitlab.com/grains2/flvn.
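The abstract describes LTNs as grounding a first-order logic language in differentiable operations over real tensors. A minimal sketch of this idea, assuming product t-norm connectives and a mean-based universal quantifier (common LTN conventions, not taken from the FLVN code):

```python
# Toy sketch of LTN-style fuzzy-logic grounding (illustrative only; not the
# FLVN implementation). Truth values live in [0, 1] and logical connectives
# become differentiable arithmetic, so in a real system axiom satisfaction
# can be maximized by gradient descent.

def fuzzy_and(a, b):
    """Conjunction via the product t-norm."""
    return a * b

def fuzzy_or(a, b):
    """Disjunction via the probabilistic sum."""
    return a + b - a * b

def fuzzy_implies(a, b):
    """Implication a -> b (Reichenbach form)."""
    return 1.0 - a + a * b

def forall(truths):
    """Universal quantifier approximated by the mean truth over a domain."""
    return sum(truths) / len(truths)

# Axiom: "every zebra is an equine", evaluated over a toy batch of images.
is_zebra = [0.9, 0.1, 0.8]    # per-image truth of Zebra(x)
is_equine = [0.95, 0.2, 0.9]  # per-image truth of Equine(x)

axiom_sat = forall([fuzzy_implies(z, e) for z, e in zip(is_zebra, is_equine)])
```

In an actual LTN, the per-image truth values would come from neural predicates over image embeddings, and a term like 1 - axiom_sat would enter the training loss.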
Related papers
- Neural Reasoning Networks: Efficient Interpretable Neural Networks With Automatic Textual Explanations [45.974930902038494]
We propose a novel neuro-symbolic architecture, Neural Reasoning Networks (NRN), that is scalable and generates logically sound textual explanations for its predictions.
A training algorithm (R-NRN) learns the network weights as usual via gradient descent with backpropagation, but also learns the network structure itself using a bandit-based optimization.
R-NRN explanations are shorter than the compared approaches while producing more accurate feature importance scores.
arXiv Detail & Related papers (2024-10-10T14:27:12Z)
- Half-Space Feature Learning in Neural Networks [2.3249139042158853]
There currently exist two extreme viewpoints for neural network feature learning.
We argue, based on a novel viewpoint, that neither interpretation is likely to be correct.
We use this alternate interpretation to motivate a model called the Deep Linearly Gated Network (DLGN).
arXiv Detail & Related papers (2024-04-05T12:03:19Z)
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- Neuro-symbolic Rule Learning in Real-world Classification Tasks [75.0907310059298]
We extend pix2rule's neural DNF module to support rule learning in real-world multi-class and multi-label classification tasks.
We propose a novel extended model called neural DNF-EO (Exactly One) which enforces mutual exclusivity in multi-class classification.
arXiv Detail & Related papers (2023-03-29T13:27:14Z)
- Two-level Graph Network for Few-Shot Class-Incremental Learning [7.815043173207539]
Few-shot class-incremental learning (FSCIL) aims to design machine learning algorithms that can continually learn new concepts from a few data points.
Existing FSCIL methods ignore the semantic relationships between sample-level and class-level representations.
In this paper, we design a two-level graph network for FSCIL named Sample-level and Class-level Graph Neural Network (SCGN).
arXiv Detail & Related papers (2023-03-24T08:58:08Z)
- PROTOtypical Logic Tensor Networks (PROTO-LTN) for Zero Shot Learning [2.236663830879273]
Logic Tensor Networks (LTNs) are neuro-symbolic systems based on a differentiable, first-order logic grounded into a deep neural network.
We focus here on the subsumption, or isOfClass, predicate, which is fundamental to encode most semantic image interpretation tasks.
We propose a common isOfClass predicate, whose level of truth is a function of the distance between an object embedding and the corresponding class prototype.
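A distance-based predicate of this kind can be sketched as follows (a hypothetical reading, assuming a squared-Euclidean distance with exponential squashing; PROTO-LTN's exact formulation may differ):

```python
import math

def is_of_class(embedding, prototype, alpha=1.0):
    """Hypothetical isOfClass truth value in (0, 1]: decays exponentially
    with the squared Euclidean distance to the class prototype."""
    sq_dist = sum((e - p) ** 2 for e, p in zip(embedding, prototype))
    return math.exp(-alpha * sq_dist)

# An embedding close to the "zebra" prototype gets a truth value near 1;
# a distant embedding gets a value near 0.
zebra_prototype = [1.0, 0.0]
truth_near = is_of_class([0.9, 0.1], zebra_prototype)
truth_far = is_of_class([0.0, 1.0], zebra_prototype)
```

A larger alpha makes the predicate sharper around the prototype; in a trained system the embeddings and prototypes would be produced by neural networks.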
arXiv Detail & Related papers (2022-06-26T18:34:07Z)
- Knowledge Enhanced Neural Networks for relational domains [83.9217787335878]
We focus on a specific method, KENN, a Neural-Symbolic architecture that injects prior logical knowledge into a neural network.
In this paper, we propose an extension of KENN for relational data.
arXiv Detail & Related papers (2022-05-31T13:00:34Z)
- On Feature Learning in Neural Networks with Global Convergence Guarantees [49.870593940818715]
We study the optimization of wide neural networks (NNs) via gradient flow (GF).
We show that when the input dimension is no less than the size of the training set, the training loss converges to zero at a linear rate under GF.
We also show empirically that, unlike in the Neural Tangent Kernel (NTK) regime, our multi-layer model exhibits feature learning and can achieve better generalization performance than its NTK counterpart.
arXiv Detail & Related papers (2022-04-22T15:56:43Z)
- Isometric Propagation Network for Generalized Zero-shot Learning [72.02404519815663]
A popular strategy is to learn a mapping between the semantic space of class attributes and the visual space of images based on the seen classes and their data.
We propose the Isometric Propagation Network (IPN), which learns to strengthen the relation between classes within each space and align the class dependency in the two spaces.
IPN achieves state-of-the-art performance on three popular Zero-shot learning benchmarks.
arXiv Detail & Related papers (2021-02-03T12:45:38Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.