Sparse Multitask Learning for Efficient Neural Representation of Motor Imagery and Execution
- URL: http://arxiv.org/abs/2312.05828v1
- Date: Sun, 10 Dec 2023 09:06:16 GMT
- Title: Sparse Multitask Learning for Efficient Neural Representation of Motor Imagery and Execution
- Authors: Hye-Bin Shin, Kang Yin, Seong-Whan Lee
- Abstract summary: We introduce a sparse multitask learning framework for motor imagery (MI) and motor execution (ME) tasks.
Given a dual-task CNN model for MI-ME classification, we apply a saliency-based sparsification approach to prune superfluous connections.
Our results indicate that this tailored sparsity can mitigate the overfitting problem and improve test performance with a small amount of data.
- Score: 30.186917337606477
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the quest for efficient neural network models for neural data
interpretation and user intent classification in brain-computer interfaces
(BCIs), learning meaningful sparse representations of the underlying neural
subspaces is crucial. The present study introduces a sparse multitask learning
framework for motor imagery (MI) and motor execution (ME) tasks, inspired by
the natural partitioning of associated neural subspaces observed in the human
brain. Given a dual-task CNN model for MI-ME classification, we apply a
saliency-based sparsification approach to prune superfluous connections and
reinforce those that show high importance in both tasks. Through our approach,
we seek to elucidate the distinct and common neural ensembles associated with
each task, employing principled sparsification techniques to eliminate
redundant connections and boost the fidelity of neural signal decoding. Our
results indicate that this tailored sparsity can mitigate the overfitting
problem and improve the test performance with a small amount of data, suggesting
a viable path forward for computationally efficient and robust BCI systems.
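No reference implementation accompanies the abstract, so the following is a minimal sketch of one plausible reading of the approach: a shared convolutional encoder with separate MI and ME heads, pruned by first-order Taylor saliency (|weight x gradient|), where the kept set is the union of each task's most salient connections so that weights important to both tasks are retained. The architecture, saliency criterion, and keep ratio are all assumptions, not the authors' exact method.

```python
# Hedged sketch of saliency-based pruning for a dual-task (MI/ME) CNN.
# Layer sizes, the |w * grad| saliency, and keep_ratio are assumptions.
import torch
import torch.nn as nn

class DualTaskCNN(nn.Module):
    """Shared temporal-conv encoder with one head per task (MI / ME)."""
    def __init__(self, n_channels=22, n_classes=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=25, padding=12),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
        )
        self.mi_head = nn.Linear(32, n_classes)   # motor imagery head
        self.me_head = nn.Linear(32, n_classes)   # motor execution head

    def forward(self, x):                          # x: (batch, chans, time)
        z = self.encoder(x)
        return self.mi_head(z), self.me_head(z)

def saliency_masks(model, x, y_mi, y_me, keep_ratio=0.5):
    """Binary masks keeping connections salient for EITHER task, so
    connections important to BOTH tasks are always retained."""
    loss_fn = nn.CrossEntropyLoss()
    masks = {}
    for loss in (loss_fn(model(x)[0], y_mi), loss_fn(model(x)[1], y_me)):
        model.zero_grad()
        loss.backward()
        for name, p in model.encoder.named_parameters():
            s = (p * p.grad).abs()             # first-order Taylor saliency
            k = max(1, int(keep_ratio * s.numel()))
            thresh = s.flatten().kthvalue(s.numel() - k + 1).values
            m = (s >= thresh).float()
            masks[name] = torch.maximum(masks.get(name, m), m)  # union
    return masks

def apply_masks(model, masks):
    """Zero out pruned connections in place."""
    with torch.no_grad():
        for name, p in model.encoder.named_parameters():
            p.mul_(masks[name])
```

Recomputing the masks periodically during training and re-applying them after each optimizer step would give an iterative prune-and-reinforce loop in the same spirit.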
Related papers
- Growing Deep Neural Network Considering with Similarity between Neurons [4.32776344138537]
We explore a novel approach that progressively increases the number of neurons in compact models during training.
We propose a method that reduces feature extraction biases and neuronal redundancy by introducing constraints based on neuron similarity distributions.
Experiments on the CIFAR-10 and CIFAR-100 datasets demonstrated improved accuracy.
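The exact similarity constraint is not given in this summary; as a hypothetical illustration of the underlying redundancy check, one can compare a candidate neuron's incoming weights against existing neurons by cosine similarity and grow the layer only with sufficiently dissimilar neurons:

```python
# Hypothetical sketch of similarity-aware neuron growth (numpy only).
# The paper's actual constraint on neuron-similarity distributions is
# not reproduced here; this only shows the basic redundancy check.
import numpy as np

def max_cosine_similarity(W, v):
    """Highest cosine similarity between candidate neuron v (incoming
    weights) and the rows of W (existing neurons' incoming weights)."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    vn = v / np.linalg.norm(v)
    return float(np.max(Wn @ vn))

def grow_layer(W, rng, tau=0.6, max_tries=50):
    """Append one neuron whose weights are not too similar to any
    existing neuron (similarity below tau), to limit redundancy."""
    for _ in range(max_tries):
        v = rng.standard_normal(W.shape[1]) * 0.1
        if max_cosine_similarity(W, v) < tau:
            return np.vstack([W, v])
    return W  # no sufficiently novel candidate found

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16)) * 0.1   # 8 hidden neurons, 16 inputs
W = grow_layer(W, rng)
print(W.shape)                            # (9, 16) if growth succeeded
```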
arXiv Detail & Related papers (2024-08-23T11:16:37Z)
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
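The precise SANE tokenization is not spelled out here; a rough sketch of the general idea of treating weight subsets as a sequence might look like this (the token length and zero-padding are assumptions):

```python
# Rough sketch: flatten a model's weight matrices and cut them into
# fixed-length tokens that a sequence model could consume. Not the
# exact SANE procedure.
import numpy as np

def weights_to_tokens(weight_mats, token_len=32):
    flat = np.concatenate([W.ravel() for W in weight_mats])
    pad = (-flat.size) % token_len            # zero-pad to a multiple
    flat = np.pad(flat, (0, pad))
    return flat.reshape(-1, token_len)        # (num_tokens, token_len)

rng = np.random.default_rng(0)
tokens = weights_to_tokens([rng.standard_normal((4, 3)),
                            rng.standard_normal((8, 4))])
print(tokens.shape)                           # (2, 32): 44 weights, 2 tokens
```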
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Spiking representation learning for associative memories [0.0]
We introduce a novel artificial spiking neural network (SNN) that performs unsupervised representation learning and associative memory operations.
The architecture of our model derives from the neocortical columnar organization and combines feedforward projections for learning hidden representations and recurrent projections for forming associative memories.
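The columnar spiking architecture itself is beyond a short sketch, but a classical Hopfield-style network illustrates, in miniature, how recurrent projections can store and recall associative memories (this stand-in is not the paper's model):

```python
# Classical Hopfield-style associative memory as a minimal stand-in.
import numpy as np

def store(patterns):
    """Hebbian storage of +/-1 patterns in a recurrent weight matrix."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)                  # no self-connections
    return W

def recall(W, x, steps=10):
    """Iterate the recurrent dynamics until the state settles."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1.0, -1.0)
    return x

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = store([pattern])
noisy = pattern.copy(); noisy[0] *= -1        # corrupt one bit
print(np.array_equal(recall(W, noisy), pattern))  # True: memory recovered
```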
arXiv Detail & Related papers (2024-06-05T08:30:11Z)
- Understanding Auditory Evoked Brain Signal via Physics-informed Embedding Network with Multi-Task Transformer [3.261870217889503]
We propose an innovative multi-task learning model, the Physics-informed Embedding Network with Multi-Task Transformer (PEMT-Net).
PEMT-Net enhances decoding performance through physics-informed embedding and deep learning techniques.
Experiments on a specific dataset demonstrate PEMT-Net's strong performance in multi-task auditory signal decoding.
arXiv Detail & Related papers (2024-06-04T06:53:32Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
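As a toy illustration of the representation this summary describes (not the paper's code), an MLP's parameters can be rewritten as a graph with one node per neuron and one weighted edge per parameter; biases are omitted for brevity:

```python
# Toy conversion of MLP weights into a (nodes, edges) graph, the kind
# of computational-graph representation a GNN could then process.
import numpy as np

def mlp_to_graph(weight_mats):
    """weight_mats: list of (fan_out, fan_in) arrays, one per layer.
    Returns (num_nodes, edges) with edges as (src, dst, weight)."""
    sizes = [weight_mats[0].shape[1]] + [W.shape[0] for W in weight_mats]
    offsets = np.cumsum([0] + sizes[:-1])     # node-id offset per layer
    edges = []
    for layer, W in enumerate(weight_mats):
        for j in range(W.shape[0]):           # destination neuron
            for i in range(W.shape[1]):       # source neuron
                edges.append((offsets[layer] + i,
                              offsets[layer + 1] + j,
                              float(W[j, i])))
    return sum(sizes), edges

rng = np.random.default_rng(0)
n, edges = mlp_to_graph([rng.standard_normal((4, 3)),
                         rng.standard_normal((2, 4))])
print(n, len(edges))   # 9 nodes, 4*3 + 2*4 = 20 edges
```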
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Simple and Effective Transfer Learning for Neuro-Symbolic Integration [50.592338727912946]
Neuro-Symbolic Integration (NeSy), where neural approaches are combined with symbolic reasoning, is a potential solution to the limitations of purely neural models.
Most of these methods exploit a neural network to map perceptions to symbols and a logical reasoner to predict the output of the downstream task.
They suffer from several issues, including slow convergence, learning difficulties with complex perception tasks, and convergence to local minima.
This paper proposes a simple yet effective method to ameliorate these problems.
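The pipeline described above can be sketched generically; the MNIST-addition-style example and the stub perception function below are illustrative assumptions, not the paper's setup:

```python
# Generic NeSy pipeline: a perception model maps inputs to symbols,
# and a logical reasoner computes the final answer from the symbols.
from typing import Callable, Sequence

def nesy_predict(perceive: Callable, reason: Callable, inputs: Sequence):
    symbols = [perceive(x) for x in inputs]   # neural: input -> symbol
    return reason(symbols)                    # symbolic: symbols -> output

# Stub perception: pretend each 'image' is already labeled.
perceive = lambda image: image["digit"]
reason = lambda digits: sum(digits)           # rule for digit addition

print(nesy_predict(perceive, reason, [{"digit": 3}, {"digit": 5}]))  # 8
```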
arXiv Detail & Related papers (2024-02-21T15:51:01Z)
- A Spiking Binary Neuron -- Detector of Causal Links [0.0]
Causal relationship recognition is a fundamental operation in neural networks aimed at learning behavior, action planning, and inferring external world dynamics.
This research paper presents a novel approach to realize causal relationship recognition using a simple spiking binary neuron.
arXiv Detail & Related papers (2023-09-15T15:34:17Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
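As a concrete, hedged illustration of what heterogeneous coding means, the sketch below contrasts two standard schemes, rate coding and latency coding, that a hybrid SNN could mix across layers:

```python
# Two standard spike-coding schemes; purely illustrative, numpy only.
import numpy as np

def rate_code(x, t_steps, rng):
    """Rate coding: spike probability per step proportional to x in [0,1]."""
    return (rng.random((t_steps, x.size)) < x).astype(np.int8)

def latency_code(x, t_steps):
    """Temporal (latency) coding: stronger inputs spike earlier; one
    spike per neuron at step round((1 - x) * (t_steps - 1))."""
    spikes = np.zeros((t_steps, x.size), dtype=np.int8)
    t = np.rint((1.0 - x) * (t_steps - 1)).astype(int)
    spikes[t, np.arange(x.size)] = 1
    return spikes

rng = np.random.default_rng(0)
x = np.array([0.1, 0.5, 0.9])
print(rate_code(x, 8, rng).sum(axis=0))   # more spikes for larger x
print(latency_code(x, 8).argmax(axis=0))  # earlier spike for larger x
```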
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
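The paper's exact diversity measure is not given in this summary; one simple hypothetical choice is the mean pairwise cosine distance between neurons' activation patterns over a batch:

```python
# Hypothetical neuron-diversity metric: mean pairwise cosine distance
# between neurons' activation patterns over a batch.
import numpy as np

def neuron_diversity(acts):
    """acts: (batch, neurons) hidden activations. Returns mean pairwise
    cosine distance between neuron activation vectors (the columns)."""
    A = acts / (np.linalg.norm(acts, axis=0, keepdims=True) + 1e-12)
    sim = A.T @ A                        # (neurons, neurons) cosine sims
    n = sim.shape[0]
    off_diag = sim[~np.eye(n, dtype=bool)]
    return float(1.0 - off_diag.mean())

rng = np.random.default_rng(0)
diverse = rng.standard_normal((64, 10))
redundant = np.tile(rng.standard_normal((64, 1)), (1, 10))
print(neuron_diversity(diverse))    # close to 1 (high diversity)
print(neuron_diversity(redundant))  # 0.0 (all neurons identical)
```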
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- LOCUS: A Novel Decomposition Method for Brain Network Connectivity Matrices using Low-rank Structure with Uniform Sparsity [8.105772140598056]
Network-oriented research has become increasingly popular in many scientific areas.
In neuroscience research, imaging-based network connectivity measures have become key to characterizing brain organization.
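The LOCUS algorithm itself is more involved; the toy sketch below only illustrates the general idea of approximating a symmetric connectivity matrix with a few rank-1 components whose loading vectors are sparsified by soft-thresholding:

```python
# Toy low-rank-plus-sparsity illustration (not the LOCUS algorithm).
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_low_rank(C, rank=3, lam=0.05):
    """C: symmetric (nodes, nodes) connectivity matrix. Returns the
    reconstruction from `rank` sparsified eigencomponents."""
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(-np.abs(vals))[:rank]   # strongest components
    C_hat = np.zeros_like(C)
    for k in order:
        u = soft_threshold(vecs[:, k], lam)    # sparse loading vector
        C_hat += vals[k] * np.outer(u, u)
    return C_hat

rng = np.random.default_rng(0)
u = soft_threshold(rng.standard_normal(20), 1.0)   # a sparse pattern
C = 2.0 * np.outer(u, u) + 0.01 * rng.standard_normal((20, 20))
C = (C + C.T) / 2                                   # symmetrize noise
err = np.linalg.norm(C - sparse_low_rank(C), "fro") / np.linalg.norm(C, "fro")
print(f"relative error: {err:.3f}")                 # small for this toy C
```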
arXiv Detail & Related papers (2020-08-19T05:47:12Z)