Binary Multi Channel Morphological Neural Network
- URL: http://arxiv.org/abs/2204.08768v1
- Date: Tue, 19 Apr 2022 09:26:11 GMT
- Title: Binary Multi Channel Morphological Neural Network
- Authors: Theodore Aouad and Hugues Talbot
- Abstract summary: We introduce a Binary Morphological Neural Network (BiMoNN) built upon the convolutional neural network.
We demonstrate an equivalence between BiMoNNs and morphological operators that we can use to binarize entire networks.
These can learn classical morphological operators and show promising results on a medical imaging application.
- Score: 5.551756485554158
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural networks, and deep learning in particular, have been comparatively little
studied from a theoretical point of view. Conversely, Mathematical Morphology
is a discipline with solid theoretical foundations. We combine these domains to
propose a new type of neural architecture that is theoretically more
explainable. We introduce a Binary Morphological Neural Network (BiMoNN) built
upon the convolutional neural network. We design it for learning morphological
networks with binary inputs and outputs. We demonstrate an equivalence between
BiMoNNs and morphological operators that we can use to binarize entire
networks. These can learn classical morphological operators and show promising
results on a medical imaging application.
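The core idea of the paper is to replace the convolutions of a CNN with the two basic morphological operators, erosion and dilation, on binary images. As a minimal illustration (not the paper's actual BiMoNN implementation), both operators can be written as the same 0/1-weighted correlation followed by a threshold, which is exactly what makes them expressible in a convolutional framework; the function names and the cross-shaped structuring element below are illustrative choices.

```python
import numpy as np


def _morph(image, selem, threshold):
    # Correlate the binary image with the structuring element (a
    # "convolution" with 0/1 weights), then threshold the result --
    # this thresholded-convolution view is the CNN analogy.
    h, w = image.shape
    kh, kw = selem.shape
    padded = np.pad(image, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(image)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + kh, j:j + kw]
            out[i, j] = int((window * selem).sum() >= threshold)
    return out


def binary_dilation(image, selem):
    """Output is 1 where the structuring element, centered at a pixel,
    overlaps at least one foreground pixel (threshold = 1)."""
    return _morph(image, selem, threshold=1)


def binary_erosion(image, selem):
    """Output is 1 where the structuring element fits entirely inside
    the foreground (threshold = number of element pixels)."""
    return _morph(image, selem, threshold=int(selem.sum()))


# A 5x5 image with a single foreground pixel and a 3x3 cross element.
img = np.zeros((5, 5), dtype=int)
img[2, 2] = 1
cross = np.array([[0, 1, 0],
                  [1, 1, 1],
                  [0, 1, 0]])

dilated = binary_dilation(img, cross)    # grows the point into a cross
eroded = binary_erosion(dilated, cross)  # shrinks it back to the point
```

In a learned morphological layer, the fixed 0/1 structuring element above is replaced by trainable weights, with the threshold choosing between erosion-like and dilation-like behavior.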
Related papers
- Unsupervised representation learning with Hebbian synaptic and structural plasticity in brain-like feedforward neural networks [0.0]
We introduce and evaluate a brain-like neural network model capable of unsupervised representation learning.
The model was tested on a diverse set of popular machine learning benchmarks.
arXiv Detail & Related papers (2024-06-07T08:32:30Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend residual neural networks (ResNets) to Riemannian manifolds.
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z) - An Algorithm to Train Unrestricted Sequential Discrete Morphological
Neural Networks [0.0]
We propose an algorithm to learn unrestricted sequential DMNN, whose architecture is given by the composition of general W-operators.
We illustrate the algorithm in a practical example.
arXiv Detail & Related papers (2023-10-06T20:55:05Z) - Linear Leaky-Integrate-and-Fire Neuron Model Based Spiking Neural
Networks and Its Mapping Relationship to Deep Neural Networks [7.840247953745616]
Spiking neural networks (SNNs) are brain-inspired machine learning algorithms with merits such as biological plausibility and unsupervised learning capability.
This paper establishes a precise mathematical mapping between the biological parameters of the Linear Leaky-Integrate-and-Fire (LIF) model/SNNs and the parameters of ReLU-AN/Deep Neural Networks (DNNs).
arXiv Detail & Related papers (2022-05-31T17:02:26Z) - Functional2Structural: Cross-Modality Brain Networks Representation
Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
arXiv Detail & Related papers (2022-05-06T03:45:36Z) - Binary Morphological Neural Network [5.551756485554158]
We create a neural morphological network that handles binary inputs and outputs.
We propose their construction inspired by CNNs to formulate layers adapted to such images by replacing convolutions with erosions and dilations.
We present promising experimental results on learning basic binary operators.
arXiv Detail & Related papers (2022-03-23T11:30:34Z) - Deep Reinforcement Learning Guided Graph Neural Networks for Brain
Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z) - Dynamic Inference with Neural Interpreters [72.90231306252007]
We present Neural Interpreters, an architecture that factorizes inference in a self-attention network as a system of modules.
Inputs to the model are routed through a sequence of functions in a way that is learned end-to-end.
We show that Neural Interpreters perform on par with the vision transformer using fewer parameters, while being transferable to a new task in a sample-efficient manner.
arXiv Detail & Related papers (2021-10-12T23:22:45Z) - A multi-agent model for growing spiking neural networks [0.0]
This project explored rules for growing the connections between neurons in spiking neural networks as a learning mechanism.
Results in a simulation environment showed that for a given set of parameters it is possible to reach topologies that reproduce the tested functions.
This project also opens the door to the usage of techniques like genetic algorithms for obtaining the best suited values for the model parameters.
arXiv Detail & Related papers (2020-09-21T15:11:29Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.