Neural Networks as Functional Classifiers
- URL: http://arxiv.org/abs/2010.04305v1
- Date: Fri, 9 Oct 2020 00:11:01 GMT
- Title: Neural Networks as Functional Classifiers
- Authors: Barinder Thind, Kevin Multani, Jiguo Cao
- Abstract summary: We extend notable deep learning methodologies to the domain of functional data for classification problems.
We highlight the effectiveness of our method in a number of classification applications, such as the classification of spectrographic data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, there has been considerable innovation in the world of predictive methodologies. This is evident from the relative domination of machine learning approaches in various classification competitions. While these algorithms have excelled at multivariate problems, they have remained dormant in the realm of functional data analysis. We extend notable deep learning methodologies to the domain of functional data for the purpose of classification. We highlight the effectiveness of our method in a number of classification applications, such as the classification of spectrographic data. Moreover, we demonstrate the performance of our classifier through simulation studies in which we compare our approach to the functional linear model and other conventional classification methods.
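To make the setup concrete, below is a minimal sketch of one standard way to feed functional data to a neural network classifier: project each observed curve onto a fixed basis and train a feed-forward network on the resulting coefficients. The Fourier basis, the simulated curves, and the scikit-learn network are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: classify functional observations by projecting each
# curve onto a Fourier basis and feeding the coefficients to a small
# feed-forward network. Illustrates the general idea, not the paper's
# exact architecture.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulate two classes of curves observed on a common grid.
t = np.linspace(0, 1, 101)
def make_curves(n, freq):
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal((n, t.size))

X_curves = np.vstack([make_curves(200, 1.0), make_curves(200, 2.0)])
y = np.repeat([0, 1], 200)

# Fourier basis evaluated on the grid; a least-squares projection gives a
# short coefficient vector per curve.
K = 5
basis = np.column_stack(
    [np.ones_like(t)]
    + [f(2 * np.pi * k * t) for k in range(1, K + 1) for f in (np.sin, np.cos)]
)
coefs, *_ = np.linalg.lstsq(basis, X_curves.T, rcond=None)
X = coefs.T  # shape (n_curves, 2K + 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```

The basis projection reduces each infinite-dimensional observation to a short coefficient vector, which is the step that lets standard multivariate learners operate on functional data.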
Related papers
- MISS: Multiclass Interpretable Scoring Systems [13.902264070785986]
We present a machine-learning approach for constructing Multiclass Interpretable Scoring Systems (MISS).
MISS is a fully data-driven methodology for producing single, sparse, and user-friendly scoring systems for multiclass classification problems.
Results indicate that our approach is competitive with other machine learning models in terms of classification performance metrics and provides well-calibrated class probabilities.
arXiv Detail & Related papers (2024-01-10T10:57:12Z)
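To make the idea concrete, here is a toy sketch of applying a sparse multiclass scoring system at prediction time. The features, point values, and softmax link are invented for illustration; MISS learns such tables from data, and its exact probability model may differ.

```python
# Hypothetical scoring table: one column of integer points per class,
# one row per binary feature. Values are made up for this example.
import numpy as np

features = ["age > 60", "bp > 140", "on_medication"]
points = np.array([[2, 0, -1],   # age > 60
                   [1, 3,  0],   # bp > 140
                   [0, 1,  2]])  # on_medication

def predict_proba(x):
    """x: 0/1 vector of feature indicators; returns class probabilities."""
    scores = x @ points                  # per-class integer scores
    exp = np.exp(scores - scores.max())  # softmax as a stand-in link function
    return exp / exp.sum()

print(predict_proba(np.array([1, 0, 1])))  # patient with age>60, on medication
```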
- Latent class analysis by regularized spectral clustering [0.0]
We propose two new algorithms to estimate a latent class model for categorical data.
Our algorithms are developed by using a newly defined regularized Laplacian matrix calculated from the response matrix.
We further apply our algorithms to real-world categorical data with promising results.
arXiv Detail & Related papers (2023-10-28T15:09:08Z)
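A minimal sketch of the pipeline the entry above describes, assuming a standard regularized spectral clustering recipe: form a regularized Laplacian from the binary response matrix, embed subjects with its leading singular vectors, and cluster the embedding. The regularization constant and simulated data are illustrative; the authors' exact algorithm may differ.

```python
# Regularized spectral clustering on an n-by-J categorical response matrix.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Simulate responses: two latent classes with different item probabilities.
n, J = 300, 20
z = rng.integers(0, 2, n)                    # true latent classes
p = np.where(z[:, None] == 0, 0.8, 0.2)      # item-response probabilities
R = (rng.random((n, J)) < p).astype(float)   # binary response matrix

# Regularized degrees and Laplacian L = D_tau^{-1/2} R D'_tau^{-1/2}.
tau = R.sum() / n                            # a common default: average degree
d_row = R.sum(axis=1) + tau
d_col = R.sum(axis=0) + tau
L = R / np.sqrt(d_row)[:, None] / np.sqrt(d_col)[None, :]

# Embed subjects with the top singular vectors, then cluster.
U, s, Vt = np.linalg.svd(L, full_matrices=False)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(U[:, :2])
print("agreement with truth:", max(np.mean(labels == z), np.mean(labels != z)))
```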
- Multiclass classification for multidimensional functional data through deep neural networks [0.22843885788439797]
We introduce a novel functional deep neural network (mfDNN) as an innovative data mining classification tool.
We consider a sparse deep neural network architecture with rectified linear unit (ReLU) activation functions and minimize the cross-entropy loss in the multiclass classification setup.
We demonstrate the performance of mfDNN on simulated data and several benchmark datasets from different application domains.
arXiv Detail & Related papers (2023-05-22T16:56:01Z)
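A small sketch of the loss setup the mfDNN entry names: logits from a one-hidden-layer ReLU network scored with multiclass cross-entropy. The shapes and toy forward pass are assumptions for illustration, not the mfDNN architecture.

```python
# ReLU network logits scored with mean multiclass cross-entropy.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def cross_entropy(logits, y):
    """Mean multiclass cross-entropy; y holds integer class labels."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 10))             # e.g. basis coefficients of curves
W1, W2 = rng.standard_normal((10, 16)), rng.standard_normal((16, 3))
logits = relu(X @ W1) @ W2                   # one hidden ReLU layer, 3 classes
print(cross_entropy(logits, rng.integers(0, 3, 8)))
```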
- On Generalizing Beyond Domains in Cross-Domain Continual Learning [91.56748415975683]
Deep neural networks often suffer from catastrophic forgetting of previously learned knowledge after learning a new task.
Our proposed approach learns new tasks under domain shift with accuracy boosts up to 10% on challenging datasets such as DomainNet and OfficeHome.
arXiv Detail & Related papers (2022-03-08T09:57:48Z)
- Discriminative Attribution from Counterfactuals [64.94009515033984]
We present a method for neural network interpretability by combining feature attribution with counterfactual explanations.
We show that this method can be used to quantitatively evaluate the performance of feature attribution methods in an objective manner.
arXiv Detail & Related papers (2021-09-28T00:53:34Z)
- MCDAL: Maximum Classifier Discrepancy for Active Learning [74.73133545019877]
Recent state-of-the-art active learning methods have mostly leveraged Generative Adversarial Networks (GAN) for sample acquisition.
We propose in this paper a novel active learning framework that we call Maximum Classifier Discrepancy for Active Learning (MCDAL).
In particular, we utilize two auxiliary classification layers that learn tighter decision boundaries by maximizing the discrepancies among them.
arXiv Detail & Related papers (2021-07-23T06:57:08Z)
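A hedged sketch of the acquisition idea the MCDAL entry describes: given softmax outputs from two auxiliary classifier heads, select the unlabeled samples on which the heads disagree most. The L1 discrepancy measure and the random "heads" below are illustrative assumptions, not the exact MCDAL rule.

```python
# Discrepancy-based acquisition: label the points where two heads disagree most.
import numpy as np

def acquire(probs_a, probs_b, k):
    """probs_*: (n, n_classes) softmax outputs of the two heads."""
    discrepancy = np.abs(probs_a - probs_b).sum(axis=1)  # per-sample disagreement
    return np.argsort(discrepancy)[-k:]                  # indices to label next

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
p1 = softmax(rng.standard_normal((100, 5)))  # stand-ins for the two heads' outputs
p2 = softmax(rng.standard_normal((100, 5)))
print(acquire(p1, p2, k=10))
```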
- Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z)
- Visualization of Supervised and Self-Supervised Neural Networks via Attribution Guided Factorization [87.96102461221415]
We develop an algorithm that provides per-class explainability.
In an extensive battery of experiments, we demonstrate the ability of our method to produce class-specific visualizations.
arXiv Detail & Related papers (2020-12-03T18:48:39Z)
- Theoretical Insights Into Multiclass Classification: A High-dimensional Asymptotic View [82.80085730891126]
We provide the first modern, precise analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z)
- Functorial Manifold Learning [1.14219428942199]
We first characterize manifold learning algorithms as functors that map pseudometric spaces to optimization objectives.
We then use this characterization to prove refinement bounds on manifold learning loss functions and construct a hierarchy of manifold learning algorithms.
We express several popular manifold learning algorithms as functors at different levels of this hierarchy, including Metric Multidimensional Scaling, IsoMap, and UMAP.
arXiv Detail & Related papers (2020-11-15T02:30:23Z)
- Deep Inverse Feature Learning: A Representation Learning of Error [6.5358895450258325]
This paper introduces a novel perspective about error in machine learning and proposes inverse feature learning (IFL) as a representation learning approach.
The inverse feature learning method uses a deep clustering approach to obtain a qualitative representation of the error as features.
The experimental results show that the proposed method leads to promising results in classification and especially in clustering.
arXiv Detail & Related papers (2020-03-09T17:45:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.