EEG-based AI-BCI Wheelchair Advancement: Hybrid Deep Learning with Motor Imagery for Brain Computer Interface
- URL: http://arxiv.org/abs/2509.25667v1
- Date: Tue, 30 Sep 2025 02:06:04 GMT
- Title: EEG-based AI-BCI Wheelchair Advancement: Hybrid Deep Learning with Motor Imagery for Brain Computer Interface
- Authors: Bipul Thapa, Biplov Paneru, Bishwash Paneru, Khem Narayan Poudyal
- Abstract summary: The system is designed to simulate wheelchair navigation based on motor-imagery right- and left-hand movements. A BiLSTM-BiGRU model achieves a superior test accuracy of 92.26% compared with various machine learning baseline models.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a novel Artificial Intelligence (AI)-integrated approach to Brain-Computer Interface (BCI)-based wheelchair development, using a motor-imagery right/left-hand movement mechanism for control. The system is designed to simulate wheelchair navigation from motor-imagery right- and left-hand movements using electroencephalogram (EEG) data. A pre-filtered dataset, obtained from an open-source EEG repository, was segmented into 19x200 arrays to capture the onset of hand movements; the data were acquired at a sampling frequency of 200 Hz. The system integrates a Tkinter-based interface for simulating wheelchair movements, offering users a functional and intuitive control system. We propose a BiLSTM-BiGRU model that achieves a superior test accuracy of 92.26% compared with several machine learning baselines, including XGBoost, EEGNet, and a transformer-based model. The attention-based BiLSTM-BiGRU model achieved a mean cross-validation accuracy of 90.13% under cross-validation, showcasing the potential of attention mechanisms in BCI applications.
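The windowing step described in the abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' code: it assumes the pre-filtered recording is a NumPy array of shape (19 channels, n samples) sampled at 200 Hz, and slices it into non-overlapping 19x200 windows (one second each). The function name `segment_eeg` and the non-overlapping, drop-trailing-samples policy are assumptions for illustration.

```python
import numpy as np

def segment_eeg(eeg, win_len=200):
    """Split a continuous (channels, samples) EEG recording into
    non-overlapping (channels, win_len) windows, dropping any
    trailing partial window.

    Returns an array of shape (n_windows, channels, win_len),
    ready to feed a windowed classifier."""
    n_ch, n_samples = eeg.shape
    n_win = n_samples // win_len
    trimmed = eeg[:, : n_win * win_len]
    # (channels, n_win, win_len) -> (n_win, channels, win_len)
    return trimmed.reshape(n_ch, n_win, win_len).transpose(1, 0, 2)

# Example: a 19-channel recording, 10 s at 200 Hz
recording = np.random.randn(19, 2000)
windows = segment_eeg(recording)
print(windows.shape)  # (10, 19, 200)
```

Each resulting 19x200 window is one candidate input to the BiLSTM-BiGRU classifier; the transpose puts the window index first so the array can be batched directly.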
Related papers
- MI-DETR: A Strong Baseline for Moving Infrared Small Target Detection with Bio-Inspired Motion Integration [63.87179575890912]
We propose Motion Integration DETR (MI-DETR), a bio-inspired dual-pathway detector for infrared small target detection. First, a retina-inspired cellular automaton (RCA) converts raw frame sequences into a motion map defined on the same pixel grid as the appearance image. Second, a Parvocellular-Magnocellular Interconnection (PMI) Block facilitates bidirectional feature interaction between the two pathways.
arXiv Detail & Related papers (2026-03-05T11:39:31Z) - A Real-Time BCI for Stroke Hand Rehabilitation Using Latent EEG Features from Healthy Subjects [0.0]
This study presents a real-time, portable brain-computer interface (BCI) system designed to support hand rehabilitation for stroke patients. The system combines a low-cost, 3D-printed robotic exoskeleton with an embedded controller that converts brain signals into physical hand movements.
arXiv Detail & Related papers (2025-09-07T22:19:03Z) - Neural-Driven Image Editing [51.11173675034121]
Traditional image editing relies on manual prompting, making it labor-intensive and inaccessible to individuals with limited motor control or language abilities. We propose LoongX, a hands-free image editing approach driven by neurophysiological signals. LoongX utilizes state-of-the-art diffusion models trained on a comprehensive dataset of 23,928 image editing pairs.
arXiv Detail & Related papers (2025-07-07T18:31:50Z) - BRAVE: Brain-Controlled Prosthetic Arm with Voice Integration and Embodied Learning for Enhanced Mobility [5.528262076322921]
BRAVE is a hybrid EEG- and voice-controlled prosthetic system. It aims to interpret EEG-driven motor intent, enabling movement control without reliance on residual muscle activity. The system operates in real time, with a response latency of 150 ms.
arXiv Detail & Related papers (2025-05-23T11:44:33Z) - EEG-based AI-BCI Wheelchair Advancement: A Brain-Computer Interfacing Wheelchair System Using Deep Learning Approach [0.0]
This study offers a revolutionary strategy for developing Brain-Computer Interface (BCI)-based wheelchairs that incorporate Artificial Intelligence (AI). The device uses electroencephalogram (EEG) data to mimic wheelchair navigation.
arXiv Detail & Related papers (2024-10-13T07:41:37Z) - EEG Right & Left Voluntary Hand Movement-based Virtual Brain-Computer Interfacing Keyboard Using Hybrid Deep Learning Approach [0.0]
We develop an EEG-based BMI system capable of accurately identifying voluntary keystrokes. Our approach employs a hybrid neural network architecture with BiGRU-Attention as the proposed model for interpreting EEG signals.
arXiv Detail & Related papers (2024-08-18T02:10:29Z) - Robotic Navigation Autonomy for Subretinal Injection via Intelligent Real-Time Virtual iOCT Volume Slicing [88.99939660183881]
We propose a framework for autonomous robotic navigation for subretinal injection.
Our method consists of an instrument pose estimation method, an online registration between the robotic and the iOCT system, and trajectory planning tailored for navigation to an injection target.
Our experiments on ex-vivo porcine eyes demonstrate the precision and repeatability of the method.
arXiv Detail & Related papers (2023-01-17T21:41:21Z) - FingerFlex: Inferring Finger Trajectories from ECoG signals [68.8204255655161]
FingerFlex is a convolutional encoder-decoder model adapted for finger-movement regression on electrocorticographic (ECoG) brain data.
State-of-the-art performance was achieved on the publicly available BCI Competition IV dataset 4, with a correlation coefficient between true and predicted trajectories of up to 0.74.
arXiv Detail & Related papers (2022-10-23T16:26:01Z) - ProcTHOR: Large-Scale Embodied AI Using Procedural Generation [55.485985317538194]
ProcTHOR is a framework for procedural generation of Embodied AI environments.
We demonstrate state-of-the-art results across 6 embodied AI benchmarks for navigation, rearrangement, and arm manipulation.
arXiv Detail & Related papers (2022-06-14T17:09:35Z) - Wheelchair automation by a hybrid BCI system using SSVEP and eye blinks [1.1099588962062936]
The prototype is based on a combined mechanism of steady-state visually evoked potentials (SSVEP) and eye blinks.
The prototype can be used efficiently in a home environment without causing any discomfort to the user.
arXiv Detail & Related papers (2021-06-10T08:02:31Z) - A Driving Behavior Recognition Model with Bi-LSTM and Multi-Scale CNN [59.57221522897815]
We propose a neural network model based on trajectory information for driving-behavior recognition.
We evaluate the proposed model on the public BLVD dataset, achieving satisfactory performance.
arXiv Detail & Related papers (2021-03-01T06:47:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.