Clustering and classification of low-dimensional data in explicit
feature map domain: intraoperative pixel-wise diagnosis of adenocarcinoma of
a colon in a liver
- URL: http://arxiv.org/abs/2203.03636v1
- Date: Mon, 7 Mar 2022 11:56:06 GMT
- Title: Clustering and classification of low-dimensional data in explicit
feature map domain: intraoperative pixel-wise diagnosis of adenocarcinoma of
a colon in a liver
- Authors: Dario Sitnik and Ivica Kopriva
- Abstract summary: This paper explores the approximate explicit feature map (aEFM) transform of low-dimensional data into a low-dimensional subspace in Hilbert space.
With a modest increase in computational complexity, linear algorithms yield improved performance and retain interpretability.
- Score: 0.10152838128195464
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Application of artificial intelligence in medicine brings highly accurate
predictions achieved by complex models whose reasoning is hard to interpret.
Their generalization ability can be reduced by the lack of pixel-wise annotated
images that occurs in frozen-section tissue analysis. To partially overcome this
gap, this paper explores the approximate explicit feature map (aEFM) transform of
low-dimensional data into a low-dimensional subspace in Hilbert space. There, with
a modest increase in computational complexity, linear algorithms yield improved
performance while remaining interpretable. They also remain amenable to
incremental learning, which is not a trivial issue for some nonlinear algorithms.
We demonstrate the proposed methodology on a very large-scale problem:
intraoperative pixel-wise semantic segmentation and clustering of adenocarcinoma
of a colon in a liver. Compared to the results in the input space, the logistic
classifier achieved statistically significant improvements in micro balanced
accuracy and F1 score of 12.04% and 12.58%, respectively. The support vector
machine classifier yielded increases of 8.04% and 9.41%. For clustering, increases
of 0.79% and 0.85% were obtained with an ultra-large-scale spectral clustering
algorithm. The results are supported by a discussion of interpretability using
Shapley additive explanation values for the predictions of a linear classifier in
the input space and in the aEFM-induced space.
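The abstract does not include code, so the following is only a minimal sketch of the "aEFM transform, then linear model" idea. It uses scikit-learn's RBFSampler (random Fourier features) as the approximate explicit feature map; the dataset, the RBF kernel choice, and all hyperparameters (gamma, n_components, etc.) are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): map low-dimensional inputs through an
# approximate explicit feature map, then train ordinary linear classifiers there.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.svm import LinearSVC
from sklearn.metrics import balanced_accuracy_score, f1_score
from sklearn.model_selection import train_test_split

# Toy stand-in for low-dimensional per-pixel features.
X, y = make_classification(n_samples=20000, n_features=6, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# aEFM: approximate the kernel-induced Hilbert space with a low-dimensional
# explicit map; n_components controls the dimension of that subspace.
aefm = RBFSampler(gamma=0.5, n_components=64, random_state=0)
Z_tr = aefm.fit_transform(X_tr)
Z_te = aefm.transform(X_te)

for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                  ("linear SVM", LinearSVC())]:
    clf.fit(Z_tr, y_tr)
    y_hat = clf.predict(Z_te)
    print(name,
          "balanced acc:", round(balanced_accuracy_score(y_te, y_hat), 4),
          "F1:", round(f1_score(y_te, y_hat), 4))

# The mapped features stay compatible with incremental (mini-batch) learning,
# e.g. a linear model updated chunk by chunk with partial_fit.
sgd = SGDClassifier(loss="log_loss")
for chunk in np.array_split(np.arange(len(Z_tr)), 10):
    sgd.partial_fit(Z_tr[chunk], y_tr[chunk], classes=np.unique(y_tr))
```

Because the downstream model stays linear, its coefficients (or Shapley-value attributions computed on top of it) can be inspected either in the input space or in the aEFM-induced space, which is the interpretability angle the abstract refers to.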
Related papers
- Unraveling the Impact of Heterophilic Structures on Graph Positive-Unlabeled Learning [71.9954600831939]
Positive-Unlabeled (PU) learning is vital in many real-world scenarios, but its application to graph data remains under-explored.
We show that a critical challenge for PU learning on graphs lies in edge heterophily, which directly violates the irreducibility assumption for Class-Prior Estimation.
In response to this challenge, we introduce a new method named Graph PU Learning with Label Propagation Loss (GPL).
arXiv Detail & Related papers (2024-05-30T10:30:44Z) - Gauge-optimal approximate learning for small data classification
problems [0.0]
Small data learning problems are characterized by a discrepancy between the limited amount of response variable observations and the large feature space dimension.
We propose the Gauge-Optimal Approximate Learning (GOAL) algorithm, which provides an analytically tractable joint solution to the dimension reduction, feature segmentation, and classification problems.
GOAL has been compared to other state-of-the-art machine learning (ML) tools on both synthetic data and challenging real-world applications from climate science and bioinformatics.
arXiv Detail & Related papers (2023-10-29T16:46:05Z) - Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z) - A new filter for dimensionality reduction and classification of
hyperspectral images using GLCM features and mutual information [0.0]
We introduce a new methodology for dimensionality reduction and classification of hyperspectral images.
We take into account both spectral and spatial information based on mutual information.
Experiments are performed on three well-known hyperspectral benchmark datasets.
arXiv Detail & Related papers (2022-11-01T13:19:08Z) - Fuzzy Attention Neural Network to Tackle Discontinuity in Airway
Segmentation [67.19443246236048]
Airway segmentation is crucial for the examination, diagnosis, and prognosis of lung diseases.
Some small-sized airway branches (e.g., bronchus and terminal bronchioles) significantly aggravate the difficulty of automatic segmentation.
This paper presents an efficient method for airway segmentation, comprising a novel fuzzy attention neural network and a comprehensive loss function.
arXiv Detail & Related papers (2022-09-05T16:38:13Z) - Bioinspired random projections for robust, sparse classification [0.0]
Inspired by the use of random projections in biological sensing systems, we present a new algorithm for processing data in classification problems.
This is based on observations of the human brain and the fruit fly's olfactory system and involves randomly projecting data into a space of greatly increased dimension before applying a cap operation to truncate the smaller entries.
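The snippet above already specifies the mechanism, so here is a minimal, hedged numpy sketch of that "expand then cap" idea; the Gaussian projection matrix, the output dimension, and the cap size k are illustrative assumptions rather than the paper's choices.

```python
# Sketch (not the paper's code): randomly project data into a much higher
# dimension, then keep only the k largest entries per row ("cap") and zero the rest.
import numpy as np

rng = np.random.default_rng(0)

def project_and_cap(X, out_dim=2000, k=50):
    """Random projection of rows of X to out_dim dimensions, followed by a cap
    that retains the k largest entries of each projected vector."""
    n_features = X.shape[1]
    R = rng.standard_normal((n_features, out_dim)) / np.sqrt(n_features)
    Z = X @ R
    keep = np.argpartition(Z, -k, axis=1)[:, -k:]      # indices of k largest per row
    capped = np.zeros_like(Z)
    np.put_along_axis(capped, keep, np.take_along_axis(Z, keep, axis=1), axis=1)
    return capped

X = rng.standard_normal((100, 20))        # toy low-dimensional inputs
codes = project_and_cap(X)
print(codes.shape, (codes != 0).sum(axis=1)[:5])   # sparse high-dimensional codes
```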
arXiv Detail & Related papers (2022-06-18T15:24:20Z) - Large-Margin Representation Learning for Texture Classification [67.94823375350433]
This paper presents a novel approach combining convolutional layers (CLs) and large-margin metric learning for training supervised models on small datasets for texture classification.
The experimental results on texture and histopathologic image datasets have shown that the proposed approach achieves competitive accuracy with lower computational cost and faster convergence when compared to equivalent CNNs.
arXiv Detail & Related papers (2022-06-17T04:07:45Z) - Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z) - Multilevel orthogonal Bochner function subspaces with applications to
robust machine learning [1.533771872970755]
We consider the data as instances of a random field within a relevant Bochner space.
Our key observation is that the classes can predominantly reside in two distinct subspaces.
arXiv Detail & Related papers (2021-10-04T22:01:01Z) - Dimensionality Reduction via Diffusion Map Improved with Supervised
Linear Projection [1.7513645771137178]
In this paper, we assume the data samples lie on a single underlying smooth manifold.
We define intra-class and inter-class similarities using pairwise local kernel distances.
We aim to find a linear projection to maximize the intra-class similarities and minimize the inter-class similarities simultaneously.
arXiv Detail & Related papers (2020-08-08T04:26:07Z) - Improved Slice-wise Tumour Detection in Brain MRIs by Computing
Dissimilarities between Latent Representations [68.8204255655161]
Anomaly detection for Magnetic Resonance Images (MRIs) can be solved with unsupervised methods.
We have proposed a slice-wise semi-supervised method for tumour detection based on the computation of a dissimilarity function in the latent space of a Variational AutoEncoder.
We show that by training the models on higher resolution images and by improving the quality of the reconstructions, we obtain results which are comparable with different baselines.
arXiv Detail & Related papers (2020-07-24T14:02:09Z)