Gated recurrent units and temporal convolutional network for multilabel
classification
- URL: http://arxiv.org/abs/2110.04414v1
- Date: Sat, 9 Oct 2021 00:00:16 GMT
- Title: Gated recurrent units and temporal convolutional network for multilabel
classification
- Authors: Loris Nanni, Alessandra Lumini, Alessandro Manfe, Sheryl Brahnam and
Giorgio Venturin
- Abstract summary: This work proposes a new ensemble method for managing multilabel classification.
The core of the proposed approach combines a set of gated recurrent units and temporal convolutional neural networks trained with variants of the Adam gradients optimization approach.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multilabel learning tackles the problem of associating a sample with multiple
class labels. This work proposes a new ensemble method for managing multilabel
classification: the core of the proposed approach combines a set of gated
recurrent units and temporal convolutional neural networks trained with
variants of the Adam optimization approach. Multiple Adam variants, including
a novel one proposed here, are compared and tested; these variants are based on
the difference between present and past gradients, with the step size adjusted for
each parameter. The proposed neural network approach is also combined with
Incorporating Multiple Clustering Centers (IMCC), which further boosts
classification performance. Multiple experiments on nine data sets representing
a wide variety of multilabel tasks demonstrate the robustness of our best
ensemble, which is shown to outperform the state-of-the-art. The MATLAB code
for generating the best ensembles in the experimental section will be available
at https://github.com/LorisNanni.
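The Adam variants described above modulate the per-parameter step size using the difference between the present and past gradients. As a rough illustration of this idea, here is a minimal NumPy sketch of a diffGrad-style update rule, where a sigmoid "friction" term damps the step when the gradient changes little between iterations; the function name and state layout are illustrative, and the paper's MATLAB implementations differ in their details.

```python
import numpy as np

def diffgrad_step(param, grad, state, lr=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """One diffGrad-style parameter update (illustrative sketch).

    The effective step size is scaled by a friction coefficient
    derived from |g_t - g_{t-1}|: a small gradient change (near an
    optimum) shrinks the step, while a large change leaves the
    update close to plain Adam.
    """
    state["t"] += 1
    t = state["t"]
    # Standard Adam first- and second-moment estimates.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)
    # Friction coefficient in (0.5, 1): sigmoid of the absolute
    # difference between present and past gradients.
    xi = 1.0 / (1.0 + np.exp(-np.abs(grad - state["g_prev"])))
    state["g_prev"] = grad.copy()
    return param - lr * xi * m_hat / (np.sqrt(v_hat) + eps)

# Example: minimize f(p) = ||p||^2 from a fixed starting point.
state = {"t": 0, "m": np.zeros(2), "v": np.zeros(2),
         "g_prev": np.zeros(2)}
p = np.array([1.0, -1.0])
for _ in range(100):
    p = diffgrad_step(p, 2 * p, state, lr=0.1)  # grad of ||p||^2 is 2p
```

After these iterations the parameter vector has moved toward the minimum at the origin; the damping term only changes how quickly, not where, the update converges on this simple quadratic.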
Related papers
- A data-centric approach for assessing progress of Graph Neural Networks [7.2249434861826325]
Graph Neural Networks (GNNs) have achieved state-of-the-art results in node classification tasks.
Most improvements are in multi-class classification, with less focus on the cases where each node could have multiple labels.
The first challenge in studying multi-label node classification is the scarcity of publicly available datasets.
arXiv Detail & Related papers (2024-06-18T09:41:40Z) - Convolutional autoencoder-based multimodal one-class classification [80.52334952912808]
One-class classification refers to approaches that learn from data of a single class only.
We propose a deep learning one-class classification method suitable for multimodal data.
arXiv Detail & Related papers (2023-09-25T12:31:18Z) - Reliable Representations Learning for Incomplete Multi-View Partial Multi-Label Classification [78.15629210659516]
In this paper, we propose an incomplete multi-view partial multi-label classification network named RANK.
We break through the view-level weights inherent in existing methods and propose a quality-aware sub-network to dynamically assign quality scores to each view of each sample.
Our model is not only able to handle complete multi-view multi-label datasets, but also works on datasets with missing instances and labels.
arXiv Detail & Related papers (2023-03-30T03:09:25Z) - WLD-Reg: A Data-dependent Within-layer Diversity Regularizer [98.78384185493624]
Neural networks are composed of multiple layers arranged in a hierarchical structure jointly trained with a gradient-based optimization.
We propose to complement this traditional 'between-layer' feedback with additional 'within-layer' feedback to encourage the diversity of the activations within the same layer.
We present an extensive empirical study confirming that the proposed approach enhances the performance of several state-of-the-art neural network models in multiple tasks.
arXiv Detail & Related papers (2023-01-03T20:57:22Z) - On Robust Learning from Noisy Labels: A Permutation Layer Approach [53.798757734297986]
This paper introduces a permutation layer learning approach termed PermLL to dynamically calibrate the training process of a deep neural network (DNN).
We provide two variants of PermLL in this paper: one applies the permutation layer to the model's prediction, while the other applies it directly to the given noisy label.
We validate PermLL experimentally and show that it achieves state-of-the-art performance on both real and synthetic datasets.
arXiv Detail & Related papers (2022-11-29T03:01:48Z) - Evolving Multi-Label Fuzzy Classifier [5.53329677986653]
Multi-label classification has attracted much attention in the machine learning community as it addresses the problem of assigning individual samples to more than one class at the same time.
We propose an evolving multi-label fuzzy classifier (EFC-ML) which is able to self-adapt and self-evolve its structure with new incoming multi-label samples in an incremental, single-pass manner.
arXiv Detail & Related papers (2022-03-29T08:01:03Z) - Gaussian Mixture Variational Autoencoder with Contrastive Learning for
Multi-Label Classification [27.043136219527767]
We propose a novel contrastive learning boosted multi-label prediction model.
By using contrastive learning in the supervised setting, we can exploit label information effectively.
We show that the learnt embeddings provide insights into the interpretation of label-label interactions.
arXiv Detail & Related papers (2021-12-02T04:23:34Z) - Multi-Scale Label Relation Learning for Multi-Label Classification Using
1-Dimensional Convolutional Neural Networks [0.5801044612920815]
We present Multi-Scale Label Dependence Relation Networks (MSDN), a novel approach to multi-label classification (MLC).
MSDN uses 1-dimensional convolution kernels to learn label dependencies at multiple scales.
We demonstrate that our model achieves better accuracy with a much smaller number of model parameters than RNN-based MLC models.
arXiv Detail & Related papers (2021-07-13T09:26:34Z) - Improving Calibration for Long-Tailed Recognition [68.32848696795519]
We propose two methods to improve calibration and performance in such scenarios.
For dataset bias due to different samplers, we propose shifted batch normalization.
Our proposed methods set new records on multiple popular long-tailed recognition benchmark datasets.
arXiv Detail & Related papers (2021-04-01T13:55:21Z) - RandomForestMLP: An Ensemble-Based Multi-Layer Perceptron Against Curse
of Dimensionality [0.0]
We present a novel and practical deep learning pipeline termed RandomForestMLP.
The core trainable classification engine consists of a convolutional neural network backbone followed by an ensemble of multi-layer perceptrons for the classification task.
arXiv Detail & Related papers (2020-11-02T18:25:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.