Correntropy-Based Logistic Regression with Automatic Relevance
Determination for Robust Sparse Brain Activity Decoding
- URL: http://arxiv.org/abs/2207.09693v1
- Date: Wed, 20 Jul 2022 06:49:23 GMT
- Title: Correntropy-Based Logistic Regression with Automatic Relevance
Determination for Robust Sparse Brain Activity Decoding
- Authors: Yuanhao Li, Badong Chen, Yuxi Shi, Natsue Yoshimura, Yasuharu Koike
- Abstract summary: We introduce the correntropy learning framework into the automatic relevance determination based sparse classification model.
We evaluate it on a synthetic dataset, an electroencephalogram (EEG) dataset, and a functional magnetic resonance imaging (fMRI) dataset.
- Score: 18.327196310636864
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent studies have used sparse classification to predict categorical
variables from high-dimensional brain activity signals, exposing human intentions
and mental states while selecting the relevant features automatically during
model training. However, existing sparse classification models are prone to
performance degradation caused by the noise inherent in brain recordings. To
address this issue, we propose a new robust and sparse classification algorithm:
we introduce the correntropy learning framework into the automatic relevance
determination (ARD) based sparse classification model, yielding a new
correntropy-based robust sparse logistic regression algorithm. To demonstrate its
brain activity decoding performance, we evaluate the proposed algorithm on a
synthetic dataset, an electroencephalogram (EEG) dataset, and a functional
magnetic resonance imaging (fMRI) dataset. The extensive experimental results
confirm that the proposed method not only achieves higher classification accuracy
in noisy, high-dimensional classification tasks but also selects more informative
features for the decoding scenarios. Integrating the correntropy learning
approach with the automatic relevance determination technique significantly
improves robustness to noise, yielding a more adequate robust sparse brain
decoding algorithm and a more powerful approach for real-world brain activity
decoding and brain-computer interfaces.
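To make the method description concrete, below is a minimal, hypothetical sketch of
a correntropy-weighted logistic regression with an ARD-style relevance update. It
illustrates the general technique only and is not the authors' algorithm or code:
each sample's gradient contribution is weighted by a Gaussian kernel of its
prediction error (so noisy or mislabeled samples contribute less), and per-feature
precisions re-estimated from the current weights prune irrelevant features. All
names and hyperparameters (sigma, lr, eps) are invented for the example.

```python
# Illustrative sketch only (hypothetical, not the paper's implementation):
# logistic regression whose per-sample gradients are weighted by a Gaussian
# (correntropy) kernel of the prediction error, combined with an ARD-style
# per-feature precision that drives irrelevant weights toward zero.
import numpy as np

def correntropy_ard_logreg(X, y, sigma=0.5, lr=0.01, n_iter=200, eps=1e-3):
    """X: (n, d) feature matrix, y: (n,) labels in {0, 1}. Returns weights w."""
    n, d = X.shape
    w = np.zeros(d)
    alpha = np.ones(d)                        # ARD precisions (large alpha => feature pruned)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # predicted class probabilities
        err = y - p                           # per-sample prediction error
        s = np.exp(-err**2 / (2 * sigma**2))  # correntropy kernel: outliers get small weight
        w = w + lr * (X.T @ (s * err))        # correntropy-weighted likelihood gradient step
        w = w / (1.0 + lr * alpha)            # implicit (stable) step on the ARD penalty
        alpha = 1.0 / (w**2 + eps)            # relevance update: tiny weights -> large penalty
    return w

# Toy usage: 5 informative features out of 50, with a few flipped (noisy) labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[:5] = 2.0
y = (X @ w_true > 0).astype(float)
y[:10] = 1 - y[:10]                           # simulate label noise / outliers
w_hat = correntropy_ard_logreg(X, y)
print(np.round(w_hat[:10], 2))                # informative weights dominate; the rest shrink
```

In this toy run, the flipped labels receive small correntropy weights once the model
fits the clean majority, while the ARD precisions of the uninformative features grow
and shrink the corresponding weights toward zero.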
Related papers
- Fine-tuning -- a Transfer Learning approach [0.22344294014777952]
Analysis of Electronic Health Records (EHRs) is often hampered by the abundance of missing data in this valuable resource.
Existing deep imputation methods rely on end-to-end pipelines that incorporate both imputation and downstream analyses.
This paper explores the development of a modular, deep learning-based imputation and classification pipeline.
arXiv Detail & Related papers (2024-11-06T14:18:23Z) - Sparse Bayesian Correntropy Learning for Robust Muscle Activity Reconstruction from Noisy Brain Recordings [16.788501453001395]
We propose a new robust implementation of sparse Bayesian learning, so that robustness and sparseness can be realized simultaneously.
Motivated by the strong robustness of the maximum correntropy criterion (MCC), we integrate MCC into the sparse Bayesian learning regime.
To fully evaluate the proposed method, a synthetic dataset and a real-world muscle activity reconstruction task with two different brain modalities were employed.
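As a rough illustration of where the robustness of MCC comes from (a generic sketch,
not the paper's sparse Bayesian algorithm): in its half-quadratic form, MCC
regression reduces to iteratively reweighted least squares in which each sample is
weighted by a Gaussian kernel of its current residual, so gross outliers are
effectively ignored. The kernel width sigma and ridge strength lam below are
arbitrary illustration values.

```python
# Hypothetical sketch of MCC-style robust regression via half-quadratic reweighting.
import numpy as np

def mcc_ridge(X, y, sigma=1.0, lam=1e-2, n_iter=30):
    """Robust ridge regression under an MCC-style loss. X: (n, d), y: (n,)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        r = y - X @ w                           # residuals under the current estimate
        s = np.exp(-r**2 / (2 * sigma**2))      # correntropy kernel: outliers -> near-zero weight
        Xs = X * s[:, None]                     # row-weighted design matrix
        w = np.linalg.solve(Xs.T @ X + lam * np.eye(d), Xs.T @ y)  # weighted ridge solve
    return w

# Toy usage: linear data with a few gross outliers in the targets.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.05 * rng.standard_normal(100)
y[:5] += 10.0                                   # corrupted measurements
print(np.round(mcc_ridge(X, y), 2))             # close to w_true despite the outliers
```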
arXiv Detail & Related papers (2024-04-01T08:16:15Z) - Deep Learning for real-time neural decoding of grasp [0.0]
We present a Deep Learning-based approach to the decoding of neural signals for grasp type classification.
The main goal of the presented approach is to improve over state-of-the-art decoding accuracy without relying on any prior neuroscience knowledge.
arXiv Detail & Related papers (2023-11-02T08:26:29Z) - Bandit-Driven Batch Selection for Robust Learning under Label Noise [20.202806541218944]
We introduce a novel approach for batch selection in Stochastic Gradient Descent (SGD) training, leveraging bandit algorithms.
Our methodology focuses on optimizing the learning process in the presence of label noise, a prevalent issue in real-world datasets.
arXiv Detail & Related papers (2023-10-31T19:19:01Z) - Training neural networks with structured noise improves classification and generalization [0.0]
We show how adding structure to noisy training data can substantially improve algorithm performance.
We also prove that the so-called Hebbian Unlearning rule coincides with the training-with-noise algorithm when noise is maximal.
arXiv Detail & Related papers (2023-02-26T22:10:23Z) - Low-Resource Music Genre Classification with Cross-Modal Neural Model
Reprogramming [129.4950757742912]
We introduce a novel method for leveraging pre-trained models for low-resource (music) classification based on the concept of Neural Model Reprogramming (NMR).
NMR aims at re-purposing a pre-trained model from a source domain to a target domain by modifying the input of a frozen pre-trained model.
Experimental results suggest that a neural model pre-trained on large-scale datasets can successfully perform music genre classification by using this reprogramming method.
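The reprogramming idea can be illustrated with a short, hypothetical PyTorch sketch
(the paper's actual input transformation and label mapping may differ): the
pre-trained model is frozen, and only an additive input perturbation plus a
source-to-target label mapping are trained.

```python
# Hypothetical sketch of neural model reprogramming (not the paper's implementation).
import torch
import torch.nn as nn

class Reprogrammer(nn.Module):
    def __init__(self, pretrained, input_shape, n_source_classes, n_target_classes):
        super().__init__()
        self.pretrained = pretrained
        for p in self.pretrained.parameters():    # freeze the source-domain model
            p.requires_grad = False
        self.delta = nn.Parameter(torch.zeros(*input_shape))            # trainable input perturbation
        self.label_map = nn.Linear(n_source_classes, n_target_classes)  # source->target label mapping

    def forward(self, x):
        source_logits = self.pretrained(x + self.delta)  # frozen model sees the reprogrammed input
        return self.label_map(source_logits)             # re-interpret source logits for the new task

# Toy usage with a stand-in "pre-trained" model; only delta and label_map are optimized.
pretrained = nn.Sequential(nn.Flatten(), nn.Linear(64, 10))
model = Reprogrammer(pretrained, input_shape=(1, 8, 8), n_source_classes=10, n_target_classes=4)
opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-3)
x, y = torch.randn(16, 1, 8, 8), torch.randint(0, 4, (16,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()
```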
arXiv Detail & Related papers (2022-11-02T17:38:33Z) - Continual Learning For On-Device Environmental Sound Classification [63.81276321857279]
We propose a simple and efficient continual learning method for on-device environmental sound classification.
Our method selects historical data for training by measuring per-sample classification uncertainty.
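A minimal sketch of such uncertainty-driven selection, assuming uncertainty is
measured by predictive entropy (the paper's exact criterion and buffer policy may
differ):

```python
# Hypothetical sketch: keep the historical samples the model is least certain about.
import numpy as np

def select_replay_samples(probs, k):
    """probs: (n_samples, n_classes) predicted probabilities for historical data.
    Returns the indices of the k most uncertain samples."""
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)  # per-sample predictive entropy
    return np.argsort(entropy)[-k:]                           # highest-uncertainty indices

# Toy usage: 6 historical samples over 3 classes; keep the 2 most ambiguous ones.
probs = np.array([[0.98, 0.01, 0.01],
                  [0.40, 0.35, 0.25],
                  [0.90, 0.05, 0.05],
                  [0.34, 0.33, 0.33],
                  [0.70, 0.20, 0.10],
                  [0.55, 0.25, 0.20]])
print(select_replay_samples(probs, k=2))   # -> the near-uniform rows (indices 1 and 3)
```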
arXiv Detail & Related papers (2022-07-15T12:13:04Z) - Improving robustness of jet tagging algorithms with adversarial training [56.79800815519762]
We investigate the vulnerability of flavor tagging algorithms via application of adversarial attacks.
We present an adversarial training strategy that mitigates the impact of such simulated attacks.
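As background, a common instantiation of adversarial training is the FGSM-style
step sketched below; this is a generic illustration, and the attacks and training
recipe used for jet flavor tagging may differ. The stand-in tagger, feature count,
and epsilon are invented for the example.

```python
# Hypothetical sketch of FGSM-style adversarial training (not the paper's exact recipe).
import torch
import torch.nn as nn

def fgsm_adversarial_step(model, x, y, optimizer, epsilon=0.05):
    """One training step on FGSM-perturbed inputs. x: (batch, features), y: (batch,)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()                                          # gradient of the loss w.r.t. the inputs
    x_adv = (x_adv + epsilon * x_adv.grad.sign()).detach()   # perturb inputs to increase the loss
    optimizer.zero_grad()
    adv_loss = nn.functional.cross_entropy(model(x_adv), y)  # train on the attacked batch
    adv_loss.backward()
    optimizer.step()
    return adv_loss.item()

# Toy usage with a stand-in tagger: 16 jets, 20 input features, 3 flavor classes.
model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 3))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(16, 20), torch.randint(0, 3, (16,))
print(fgsm_adversarial_step(model, x, y, opt))
```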
arXiv Detail & Related papers (2022-03-25T19:57:19Z) - Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z) - Ensemble Wrapper Subsampling for Deep Modulation Classification [70.91089216571035]
Subsampling of received wireless signals is important for relaxing hardware requirements and reducing the computational cost of signal processing algorithms.
We propose a subsampling technique to facilitate the use of deep learning for automatic modulation classification in wireless communication systems.
arXiv Detail & Related papers (2020-05-10T06:11:13Z) - Rectified Meta-Learning from Noisy Labels for Robust Image-based Plant
Disease Diagnosis [64.82680813427054]
Plant diseases are one of the main threats to food security and crop production.
One popular approach is to cast this problem as a leaf image classification task, which can be addressed by powerful convolutional neural networks (CNNs).
We propose a novel framework that incorporates a rectified meta-learning module into the common CNN paradigm to train a noise-robust deep network without using extra supervision information.
arXiv Detail & Related papers (2020-03-17T09:51:30Z)