Classification of fNIRS Data Under Uncertainty: A Bayesian Neural
Network Approach
- URL: http://arxiv.org/abs/2101.07128v1
- Date: Mon, 18 Jan 2021 15:43:59 GMT
- Title: Classification of fNIRS Data Under Uncertainty: A Bayesian Neural
Network Approach
- Authors: Talha Siddique and Md Shaad Mahmud
- Abstract summary: We use a Bayesian Neural Network (BNN) to carry out a binary classification on an open-access dataset.
Our model produced an overall classification accuracy of 86.44% over 30 volunteers.
- Score: 0.15229257192293197
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Functional Near-Infrared Spectroscopy (fNIRS) is a non-invasive form of
Brain-Computer Interface (BCI). It is used for the imaging of brain
hemodynamics and has gained popularity due to certain advantages it offers over
other similar technologies. Its overall functionality encompasses the capture,
processing and classification of brain signals. Since hemodynamic responses are
contaminated by physiological noises, several methods have been implemented in
the past literature to classify the responses in focus from the unwanted ones.
However, the methods proposed thus far do not take into consideration the
uncertainty in the data or the model parameters. In this paper, we use a Bayesian Neural
Network (BNN) to carry out a binary classification on an open-access dataset,
consisting of unilateral finger tapping (left- and right-hand finger tapping).
A BNN uses Bayesian statistics to assign a probability distribution to the
network weights instead of a point estimate. In this way, it takes data and
model uncertainty into consideration while carrying out the classification. We
used Variational Inference (VI) to train our model. Our model produced an
overall classification accuracy of 86.44% over 30 volunteers. We illustrated
how the evidence lower bound (ELBO) function of the model converges over
iterations. We further illustrated the uncertainty that is inherent during the
sampling of the posterior distribution of the weights. We also generated a ROC
curve for our BNN classifier using test data from a single volunteer and our
model has an AUC score of 0.855.
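The abstract's core idea, assigning a probability distribution to each weight and averaging predictions over posterior samples so that their spread reflects uncertainty, can be sketched as follows. This is a minimal, hypothetical NumPy illustration, not the authors' model: the single Bayesian layer, the posterior means, and the softplus parameterisation are invented for demonstration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bnn_predict(x, w_mu, w_rho, n_samples=100, rng=None):
    """Monte Carlo prediction for a single Bayesian linear layer.

    Each weight has a Gaussian posterior N(mu, softplus(rho)^2); we draw
    weights repeatedly and average the resulting class probabilities, so
    the standard deviation across draws reflects model uncertainty."""
    rng = np.random.default_rng(rng)
    w_sigma = np.log1p(np.exp(w_rho))        # softplus keeps sigma > 0
    probs = []
    for _ in range(n_samples):
        w = rng.normal(w_mu, w_sigma)        # sample weights from the posterior
        probs.append(sigmoid(x @ w))
    probs = np.array(probs)
    return probs.mean(axis=0), probs.std(axis=0)

# toy usage: 3 features, posterior centred at hypothetical means
x = np.array([[1.0, -0.5, 2.0]])
mu = np.array([0.3, -0.2, 0.1])
rho = np.full(3, -3.0)                       # small posterior std
p_mean, p_std = bnn_predict(x, mu, rho, n_samples=500, rng=0)
```

In a trained BNN the `mu` and `rho` values would come from maximising the ELBO via VI; here they are fixed placeholders so the sampling mechanics can be run in isolation.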
Related papers
- Evidence Networks: simple losses for fast, amortized, neural Bayesian
model comparison [0.0]
Evidence Networks can enable Bayesian model comparison when state-of-the-art methods fail.
We introduce the leaky parity-odd power transform, leading to the novel "l-POP-Exponential" loss function.
We show that Evidence Networks are explicitly independent of dimensionality of the parameter space and scale mildly with the complexity of the posterior probability density function.
arXiv Detail & Related papers (2023-05-18T18:14:53Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Functional Neural Networks: Shift invariant models for functional data with applications to EEG classification [0.0]
We introduce a new class of neural networks that are shift invariant and preserve the smoothness of the data: functional neural networks (FNNs).
For this, we use methods from functional data analysis (FDA) to extend multi-layer perceptrons and convolutional neural networks to functional data.
We show that the models outperform a benchmark model from FDA in terms of accuracy and successfully use FNNs to classify electroencephalography (EEG) data.
arXiv Detail & Related papers (2023-01-14T09:41:21Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Kalman Bayesian Neural Networks for Closed-form Online Learning [5.220940151628734]
We propose a novel approach for BNN learning via closed-form Bayesian inference.
The calculation of the predictive distribution of the output and the update of the weight distribution are treated as Bayesian filtering and smoothing problems.
This allows closed-form expressions for training the network's parameters in a sequential/online fashion without gradient descent.
arXiv Detail & Related papers (2021-10-03T07:29:57Z)
- The Causal Neural Connection: Expressiveness, Learnability, and Inference [125.57815987218756]
An object called structural causal model (SCM) represents a collection of mechanisms and sources of random variation of the system under investigation.
In this paper, we show that the causal hierarchy theorem (Thm. 1, Bareinboim et al., 2020) still holds for neural models.
We introduce a special type of SCM called a neural causal model (NCM), and formalize a new type of inductive bias to encode structural constraints necessary for performing causal inferences.
arXiv Detail & Related papers (2021-07-02T01:55:18Z)
- DAAIN: Detection of Anomalous and Adversarial Input using Normalizing Flows [52.31831255787147]
We introduce a novel technique, DAAIN, to detect out-of-distribution (OOD) inputs and adversarial attacks (AA).
Our approach monitors the inner workings of a neural network and learns a density estimator of the activation distribution.
Our model can be trained on a single GPU making it compute efficient and deployable without requiring specialized accelerators.
arXiv Detail & Related papers (2021-05-30T22:07:13Z)
- Neurological Status Classification Using Convolutional Neural Network [0.0]
We show that a Convolutional Neural Network (CNN) model is able to accurately discriminate between four different phases of neurological status.
We demonstrate that the proposed model obtains a 99.99% Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC) and 99.82% classification accuracy on the test dataset.
arXiv Detail & Related papers (2021-04-01T22:40:28Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Bayesian Neural Network via Stochastic Gradient Descent [0.0]
We show how gradient estimation techniques can be applied to Bayesian neural networks.
Our work considerably outperforms previous state-of-the-art approaches for regression using Bayesian neural networks.
arXiv Detail & Related papers (2020-06-04T18:33:59Z)
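The last entry above concerns training BNNs with stochastic gradients. As a generic illustration of that family of techniques (not that paper's actual algorithm), the following sketch performs reparameterisation-gradient variational inference for a single-weight regression model: the weight is sampled as `w = mu + sigma * eps`, so the ELBO gradient with respect to `mu` flows through the sample and plain stochastic ascent can be used. The data, prior, and learning rate are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic regression data with true weight 1.5 and unit noise
x = rng.normal(size=200)
y = 1.5 * x + rng.normal(scale=1.0, size=200)

# mean-field variational posterior q(w) = N(mu, sigma^2), sigma held fixed
mu, sigma, lr = 0.0, 0.1, 0.002
for _ in range(300):
    eps = rng.normal()
    w = mu + sigma * eps                 # reparameterised sample from q(w)
    # d/dw [log p(y | w, x) + log p(w)] with Gaussian likelihood, prior w ~ N(0, 1);
    # the log q(w) term drops out of the mu-gradient under this parameterisation
    grad_w = np.sum(x * (y - w * x)) - w
    mu += lr * grad_w                    # stochastic ascent on the ELBO
```

After training, `mu` approaches the posterior mean of the weight; in a full BNN the same update is applied per weight (and to the variance parameters), which is what frameworks implementing Bayes-by-backprop-style VI automate.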
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.