EEG-based AI-BCI Wheelchair Advancement: A Brain-Computer Interfacing Wheelchair System Using Machine Learning Mechanism with Right and Left Voluntary Hand Movement
- URL: http://arxiv.org/abs/2410.09763v1
- Date: Sun, 13 Oct 2024 07:41:37 GMT
- Title: EEG-based AI-BCI Wheelchair Advancement: A Brain-Computer Interfacing Wheelchair System Using Machine Learning Mechanism with Right and Left Voluntary Hand Movement
- Authors: Biplov Paneru, Bishwash Paneru, Khem Narayan Poudyal
- Abstract summary: The system is designed to simulate wheelchair navigation based on voluntary right and left-hand movements.
Various machine learning models, including Support Vector Machines (SVM), XGBoost, random forest, and a Bi-directional Long Short-Term Memory (Bi-LSTM) attention-based model, were developed.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents an Artificial Intelligence (AI)-integrated novel approach to Brain-Computer Interface (BCI)-based wheelchair development, utilizing a voluntary right/left hand movement mechanism for control. The system is designed to simulate wheelchair navigation based on voluntary right- and left-hand movements using electroencephalogram (EEG) data. A pre-filtered dataset, obtained from an open-source EEG repository, was segmented into arrays of 19x200 to capture the onset of hand movements. The data were acquired at a sampling frequency of 200 Hz in the laboratory experiment. The system integrates a Tkinter-based interface for simulating wheelchair movements, offering users a functional and intuitive control system. Various machine learning models, including Support Vector Machines (SVM), XGBoost, random forest, and a Bi-directional Long Short-Term Memory (Bi-LSTM) attention-based model, were developed. The random forest model obtained 79% accuracy. The Logistic Regression model outperformed the others with 92% accuracy, followed by the Multi-Layer Perceptron (MLP) model at 91%. The Bi-LSTM attention-based model achieved a mean accuracy of 86% through cross-validation, showcasing the potential of attention mechanisms in BCI applications.
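A minimal sketch of the classification pipeline the abstract describes (1-second windows of 19-channel EEG sampled at 200 Hz, flattened and fed to a classical classifier such as Logistic Regression) is given below. The loader functions `load_eeg` and `load_onset_labels`, the left/right label encoding, and the use of non-overlapping windows are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the windowing + classical-classifier pipeline described in the abstract.
# Assumes a pre-filtered recording of shape (19 channels, n_samples) at 200 Hz and
# one left/right label per 1-second window; loaders are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 200     # sampling frequency reported in the abstract (Hz)
WIN = 200    # window length: 200 samples = 1 s, giving 19x200 segments
N_CH = 19    # EEG channels

def segment(eeg: np.ndarray, labels: np.ndarray):
    """Slice a (19, n_samples) recording into non-overlapping 19x200 windows."""
    n_win = eeg.shape[1] // WIN
    X = np.stack([eeg[:, i * WIN:(i + 1) * WIN] for i in range(n_win)])
    return X.reshape(n_win, -1), labels[:n_win]   # flatten for classical models

# eeg: (19, n_samples) array; y: 0 = left hand, 1 = right hand (assumed encoding)
eeg, y = load_eeg(), load_onset_labels()
X, y = segment(eeg, y)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))

# A predicted label could then be mapped to a simulated wheelchair command,
# e.g. 0 -> "turn left", 1 -> "turn right", inside the Tkinter interface.
```

The same windowed features could be reshaped back to 19x200 sequences and fed to the Bi-LSTM attention-based model the abstract mentions; that variant is not shown here.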
Related papers
- Helpful DoggyBot: Open-World Object Fetching using Legged Robots and Vision-Language Models [63.89598561397856]
We present a system for quadrupedal mobile manipulation in indoor environments.
It uses a front-mounted gripper for object manipulation and a low-level controller trained in simulation using egocentric depth for agile skills.
We evaluate our system in two unseen environments without any real-world data collection or training.
arXiv Detail & Related papers (2024-09-30T20:58:38Z) - Battle of the Backbones: A Large-Scale Comparison of Pretrained Models across Computer Vision Tasks [139.3768582233067]
Battle of the Backbones (BoB) is a benchmarking tool for neural network-based computer vision systems.
We find that vision transformers (ViTs) and self-supervised learning (SSL) are increasingly popular.
In apples-to-apples comparisons on the same architectures and similarly sized pretraining datasets, we find that SSL backbones are highly competitive.
arXiv Detail & Related papers (2023-10-30T18:23:58Z) - Sequential Best-Arm Identification with Application to Brain-Computer Interface [34.87975833920409]
A brain-computer interface (BCI) is a technology that enables direct communication between the brain and an external device or computer system.
An electroencephalogram (EEG) and event-related potential (ERP)-based speller system is a type of BCI that allows users to spell words without using a physical keyboard.
We propose a sequential top-two Thompson sampling (STTS) algorithm under the fixed-confidence setting and the fixed-budget setting.
arXiv Detail & Related papers (2023-05-17T18:49:44Z) - FastRLAP: A System for Learning High-Speed Driving via Deep RL and Autonomous Practicing [71.76084256567599]
We present a system that enables an autonomous small-scale RC car to drive aggressively from visual observations using reinforcement learning (RL).
Our system, FastRLAP (faster lap), trains autonomously in the real world, without human interventions, and without requiring any simulation or expert demonstrations.
The resulting policies exhibit emergent aggressive driving skills, such as timing braking and acceleration around turns and avoiding areas which impede the robot's motion, approaching the performance of a human driver using a similar first-person interface over the course of training.
arXiv Detail & Related papers (2023-04-19T17:33:47Z) - Hybrid Paradigm-based Brain-Computer Interface for Robotic Arm Control [0.9176056742068814]
Brain-computer interface (BCI) uses brain signals to communicate with external devices without actual control.
We propose a knowledge distillation-based framework to manipulate a robotic arm through hybrid paradigm-induced EEG signals for practical use.
arXiv Detail & Related papers (2022-12-14T08:13:10Z) - FingerFlex: Inferring Finger Trajectories from ECoG signals [68.8204255655161]
The FingerFlex model is a convolutional encoder-decoder architecture adapted for finger movement regression on electrocorticographic (ECoG) brain data.
State-of-the-art performance was achieved on a publicly available BCI competition IV dataset 4 with a correlation coefficient between true and predicted trajectories up to 0.74.
arXiv Detail & Related papers (2022-10-23T16:26:01Z) - Toward smart composites: small-scale, untethered prediction and control for soft sensor/actuator systems [0.6465251961564604]
We present a suite of algorithms and tools for model-predictive control of sensor/actuator systems with embedded microcontroller units (MCUs).
These MCUs can be colocated with sensors and actuators, enabling a new class of smart composites capable of autonomous behavior.
Online Newton-Raphson optimization solves for the control input.
arXiv Detail & Related papers (2022-05-22T22:19:09Z) - Bayesian Optimization and Deep Learning for steering wheel angle prediction [58.720142291102135]
This work aims to obtain an accurate model for the prediction of the steering angle in an automated driving system.
BO was able to identify, within a limited number of trials, a model (namely BOST-LSTM) that proved the most accurate when compared to classical end-to-end driving models.
arXiv Detail & Related papers (2021-10-22T15:25:14Z) - Wheelchair automation by a hybrid BCI system using SSVEP and eye blinks [1.1099588962062936]
The prototype is based on a combined mechanism of steady-state visually evoked potential and eye blinks.
The prototype can be used efficiently in a home environment without causing any discomfort to the user.
arXiv Detail & Related papers (2021-06-10T08:02:31Z) - BeCAPTCHA-Mouse: Synthetic Mouse Trajectories and Improved Bot Detection [78.11535724645702]
We present BeCAPTCHA-Mouse, a bot detector based on a neuromotor model of mouse dynamics.
BeCAPTCHA-Mouse is able to detect bot trajectories of high realism with 93% accuracy on average using only one mouse trajectory.
arXiv Detail & Related papers (2020-05-02T17:40:49Z) - Brain-based control of car infotainment [0.0]
We present a custom portable EEG-based Brain-Computer Interface (BCI) that exploits Event-Related Potentials (ERPs) induced with an oddball experimental paradigm to control the infotainment menu of a car.
Subject-specific models were trained with different machine learning approaches to classify EEG responses to target and non-target stimuli.
No statistical differences were observed between the classification accuracies (CAs) for the in-lab and in-car training sets, nor between the EEG responses in these conditions.
arXiv Detail & Related papers (2020-04-24T20:32:05Z)