AU-NN: ANFIS Unit Neural Network
- URL: http://arxiv.org/abs/2204.11839v1
- Date: Thu, 21 Apr 2022 18:11:34 GMT
- Title: AU-NN: ANFIS Unit Neural Network
- Authors: Tonatiuh Hernández-del-Toro, Carlos A. Reyes-García, Luis Villaseñor-Pineda
- Abstract summary: This paper describes the ANFIS Unit Neural Network, a deep neural network where each neuron is an independent ANFIS.
Two use cases are presented to test the capability of the network.
- Score: 0.16058099298620418
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper describes the ANFIS Unit Neural Network (AU-NN), a deep neural
network in which each neuron is an independent ANFIS. Two use cases are presented
to test the capability of the network: (i) classification of five imagined words,
and (ii) incremental learning in the task of detecting Imagined Word Segments vs.
Idle State Segments. In both cases, the proposed network outperforms the
conventional methods. Additionally, a classification process is described in
which, instead of taking the whole instance as one example, each instance is
decomposed into a set of smaller instances, and the final label is decided by a
majority vote over all the predictions of the set. The code to build the AU-NN
used in this paper is available in the GitHub repository
https://github.com/tonahdztoro/AU_NN.
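A minimal sketch of this decompose-and-vote classification procedure is given below, assuming a generic segmenter and per-segment classifier. The function names, the window-based segmenter, and the dummy classifier are illustrative placeholders, not the authors' implementation; the actual code lives in the AU_NN repository linked above.

```python
import numpy as np
from collections import Counter

def classify_by_vote(instance, segment_fn, predict_fn):
    """Decompose one instance into smaller sub-instances, predict a label
    for each sub-instance, and return the majority-vote label."""
    segments = segment_fn(instance)                # split the instance into sub-instances
    votes = [predict_fn(seg) for seg in segments]  # one predicted label per sub-instance
    return Counter(votes).most_common(1)[0][0]     # most frequent label wins

# Toy usage with hypothetical stand-ins: a 1-D signal split into 10 windows
# and a dummy per-window classifier (a trained AU-NN would take this role).
signal = np.random.randn(1000)
split_into_windows = lambda x: np.array_split(x, 10)
dummy_classifier = lambda seg: int(seg.mean() > 0)
print(classify_by_vote(signal, split_into_windows, dummy_classifier))
```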
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Coin Flipping Neural Networks [8.009932864430901]
We show that neural networks with access to randomness can outperform deterministic networks by using amplification.
We conjecture that for most classification problems, there is a CFNN which solves them with higher accuracy or fewer neurons than any deterministic network.
arXiv Detail & Related papers (2022-06-18T11:19:44Z) - Wide and Deep Neural Networks Achieve Optimality for Classification [23.738242876364865]
We identify and construct an explicit set of neural network classifiers that achieve optimality.
In particular, we provide explicit activation functions that can be used to construct networks that achieve optimality.
Our results highlight the benefit of using deep networks for classification tasks, in contrast to regression tasks, where excessive depth is harmful.
arXiv Detail & Related papers (2022-04-29T14:27:42Z) - On the Neural Tangent Kernel Analysis of Randomly Pruned Neural Networks [91.3755431537592]
We study how random pruning of the weights affects a neural network's neural tangent kernel (NTK).
In particular, this work establishes an equivalence of the NTKs between a fully-connected neural network and its randomly pruned version.
arXiv Detail & Related papers (2022-03-27T15:22:19Z) - Self-Ensembling GAN for Cross-Domain Semantic Segmentation [107.27377745720243]
This paper proposes a self-ensembling generative adversarial network (SE-GAN) exploiting cross-domain data for semantic segmentation.
In SE-GAN, a teacher network and a student network constitute a self-ensembling model for generating semantic segmentation maps, which, together with a discriminator, forms a GAN.
Despite its simplicity, we find SE-GAN can significantly boost the performance of adversarial training and enhance the stability of the model.
arXiv Detail & Related papers (2021-12-15T09:50:25Z) - Dive into Layers: Neural Network Capacity Bounding using Algebraic
Geometry [55.57953219617467]
We show that the learnability of a neural network is directly related to its size.
We use Betti numbers to measure the topological geometric complexity of input data and the neural network.
We perform experiments on the real-world dataset MNIST, and the results verify our analysis and conclusions.
arXiv Detail & Related papers (2021-09-03T11:45:51Z) - Redundant representations help generalization in wide neural networks [71.38860635025907]
We study the last hidden layer representations of various state-of-the-art convolutional neural networks.
We find that if the last hidden representation is wide enough, its neurons tend to split into groups that carry identical information, and differ from each other only by statistically independent noise.
arXiv Detail & Related papers (2021-06-07T10:18:54Z) - Analyzing Representations inside Convolutional Neural Networks [8.803054559188048]
We propose a framework to categorize the concepts a network learns based on the way it clusters a set of input examples.
This framework is unsupervised and can work without any labels for input features.
We extensively evaluate the proposed method and demonstrate that it produces human-understandable and coherent concepts.
arXiv Detail & Related papers (2020-12-23T07:10:17Z) - Locality Guided Neural Networks for Explainable Artificial Intelligence [12.435539489388708]
We propose a novel algorithm for back propagation, called Locality Guided Neural Network (LGNN).
LGNN preserves locality between neighbouring neurons within each layer of a deep network.
In our experiments, we train various VGG and Wide ResNet (WRN) networks for image classification on CIFAR100.
arXiv Detail & Related papers (2020-07-12T23:45:51Z) - Graph Prototypical Networks for Few-shot Learning on Attributed Networks [72.31180045017835]
We propose a graph meta-learning framework, Graph Prototypical Networks (GPN).
GPN is able to perform meta-learning on an attributed network and derive a highly generalizable model for handling the target classification task.
arXiv Detail & Related papers (2020-06-23T04:13:23Z) - BUSU-Net: An Ensemble U-Net Framework for Medical Image Segmentation [0.0]
Convolutional neural networks (CNNs) have revolutionized medical image analysis.
We propose an ensemble deep neural network with an underlying U-Net framework.
We show that this ensemble network outperforms recent state-of-the-art networks in several evaluation metrics.
arXiv Detail & Related papers (2020-03-03T15:18:01Z)