An Algorithm to Train Unrestricted Sequential Discrete Morphological
Neural Networks
- URL: http://arxiv.org/abs/2310.04584v2
- Date: Fri, 2 Feb 2024 15:40:51 GMT
- Title: An Algorithm to Train Unrestricted Sequential Discrete Morphological
Neural Networks
- Authors: Diego Marcondes, Mariana Feldman and Junior Barrera
- Abstract summary: We propose an algorithm to learn unrestricted sequential DMNN, whose architecture is given by the composition of general W-operators.
We illustrate the algorithm in a practical example.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There have been attempts to insert mathematical morphology (MM) operators
into convolutional neural networks (CNN), and the most successful endeavor to
date has been the morphological neural networks (MNN). Although MNN have
performed better than CNN on some problems, they inherit their black-box
nature. Furthermore, in the case of binary images, they are approximations
that lose the Boolean lattice structure of MM operators and thus cannot
represent specific classes of W-operators with desired properties. In a
recent work, we proposed the Discrete Morphological
Neural Networks (DMNN) for binary image transformation to represent specific
classes of W-operators and estimate them via machine learning. We also proposed
a stochastic lattice descent algorithm (SLDA) to learn the parameters of
Canonical Discrete Morphological Neural Networks (CDMNN), whose architecture is
composed only of operators that can be decomposed as the supremum, infimum, and
complement of erosions and dilations. In this paper, we propose an algorithm to
learn unrestricted sequential DMNN, whose architecture is given by the
composition of general W-operators. We illustrate the algorithm in a practical
example.
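The compositional structure named in the abstract can be made concrete. Below is a minimal Python sketch, assuming only what the abstract states: CDMNN layers combine erosions and dilations via supremum, infimum, and complement, and a sequential DMNN composes such W-operators. The 3x3 window and the particular layer combination are hypothetical, and the proposed learning algorithm itself is not shown.

```python
# Illustrative sketch of a sequential composition of morphological
# W-operators, in the spirit of the abstract; NOT the authors' algorithm.
import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

# Hypothetical 3x3 cross-shaped window W (structuring element).
CROSS = np.array([[0, 1, 0],
                  [1, 1, 1],
                  [0, 1, 0]], dtype=bool)

def opening(x, w):
    # Erosion followed by dilation: a classical composed operator.
    return binary_dilation(binary_erosion(x, w), w)

def layer(x):
    # Infimum (intersection) of a dilation with the supremum (union) of
    # an opening and a complemented erosion: the kind of lattice
    # combination a CDMNN layer is built from.
    return np.logical_and(
        binary_dilation(x, CROSS),
        np.logical_or(opening(x, CROSS),
                      np.logical_not(binary_erosion(x, CROSS))),
    )

def sequential_op(x, depth=2):
    # A sequential DMNN composes such W-operators layer by layer.
    for _ in range(depth):
        x = layer(x)
    return x

image = np.random.default_rng(0).random((32, 32)) > 0.5  # toy binary image
result = sequential_op(image)
```

The learning problem the paper addresses is choosing such operators from data; the sketch only shows how supremum, infimum, and complement of erosions and dilations compose into a sequential binary image transformation.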
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - A foundation for exact binarized morphological neural networks [2.8925699537310137]
Training and running deep neural networks (NNs) often demands a lot of computation and energy-intensive specialized hardware.
One way to reduce the computation and power cost is to use binary weight NNs, but these are hard to train because the sign function has a non-smooth gradient (a common workaround is sketched after this list).
We present a model based on Mathematical Morphology (MM), which can binarize ConvNets without losing performance under certain conditions.
arXiv Detail & Related papers (2024-01-08T11:37:44Z) - Discrete Morphological Neural Networks [0.0]
We propose the Discrete Morphological Neural Networks (DMNN) for binary image analysis to represent W-operators.
As a proof-of-concept, we apply the DMNN to recognize the boundary of digits with noise.
arXiv Detail & Related papers (2023-09-01T17:04:48Z) - Resolution-Invariant Image Classification based on Fourier Neural Operators [1.3190581566723918]
We investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).
We derive the FNO architecture as an example of continuous and Fréchet-differentiable neural operators on Lebesgue spaces.
arXiv Detail & Related papers (2023-04-02T10:23:36Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order (this symmetry is checked in a sketch after this list).
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - Binary Multi Channel Morphological Neural Network [5.551756485554158]
We introduce a Binary Morphological Neural Network (BiMoNN) built upon the convolutional neural network.
We demonstrate an equivalence between BiMoNNs and morphological operators that we can use to binarize entire networks.
These can learn classical morphological operators and show promising results on a medical imaging application.
arXiv Detail & Related papers (2022-04-19T09:26:11Z) - Universal approximation property of invertible neural networks [76.95927093274392]
Invertible neural networks (INNs) are neural network architectures with invertibility by design.
Thanks to their invertibility and the tractability of their Jacobians, INNs have various machine learning applications such as probabilistic modeling, generative modeling, and representation learning.
arXiv Detail & Related papers (2022-04-15T10:45:26Z) - Learning Deep Morphological Networks with Neural Architecture Search [19.731352645511052]
We propose a method based on meta-learning to incorporate morphological operators into Deep Neural Networks.
The learned architecture demonstrates how our novel morphological operations significantly increase DNN performance on various tasks.
arXiv Detail & Related papers (2021-06-14T19:19:48Z) - Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z) - Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm that underlies learning in biological neurons.
Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
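Two of the summaries above name mechanisms concrete enough to illustrate. First, the binarization difficulty from the "A foundation for exact binarized morphological neural networks" entry: the sign function's gradient is zero almost everywhere, and a common generic workaround is the straight-through estimator (STE). A minimal PyTorch sketch of that workaround, not necessarily the cited paper's method:

```python
# Straight-through estimator for sign(): forward binarizes, backward
# passes the gradient through where |x| <= 1 (hardtanh-style clipping).
# Generic illustration only, not the cited paper's binarization scheme.
import torch

class SignSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Zero the gradient outside [-1, 1], pass it through otherwise.
        return grad_out * (x.abs() <= 1).float()

w = torch.randn(4, requires_grad=True)
loss = SignSTE.apply(w).sum()
loss.backward()
print(w.grad)  # nonzero where |w| <= 1, despite sign's flat gradient
```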
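Second, the weight-space symmetry from the "Permutation Equivariant Neural Functionals" entry can be checked directly: permuting the hidden neurons of a two-layer MLP (rows of the first weight matrix, matching entries of the first bias, and columns of the second weight matrix) leaves the computed function unchanged. A self-contained NumPy check with toy sizes, making no claim about that paper's architecture:

```python
# Verifies the hidden-neuron permutation symmetry of a two-layer MLP:
# consistently permuted weights compute exactly the same function.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)), rng.normal(size=8)  # input -> hidden
W2, b2 = rng.normal(size=(2, 8)), rng.normal(size=2)  # hidden -> output

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0) + b2        # ReLU MLP

perm = rng.permutation(8)                              # reorder hidden units
x = rng.normal(size=3)
y_orig = mlp(x, W1, b1, W2, b2)
y_perm = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)
assert np.allclose(y_orig, y_perm)  # same function, different weights
```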
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.