EEG-based AI-BCI Wheelchair Advancement: A Brain-Computer Interfacing Wheelchair System Using Deep Learning Approach
- URL: http://arxiv.org/abs/2410.09763v3
- Date: Sun, 12 Jan 2025 15:56:53 GMT
- Title: EEG-based AI-BCI Wheelchair Advancement: A Brain-Computer Interfacing Wheelchair System Using Deep Learning Approach
- Authors: Biplov Paneru, Bishwash Paneru, Bipul Thapa, Khem Narayan Poudyal,
- Abstract summary: This study presents a strategy for developing Brain-Computer Interface (BCI) based wheelchairs that incorporate Artificial Intelligence (AI).
The device uses electroencephalogram (EEG) data to mimic wheelchair navigation.
- Score: 0.0
- License:
- Abstract: This study presents a strategy for developing Brain-Computer Interface (BCI) based wheelchairs that incorporate Artificial Intelligence (AI). The device uses electroencephalogram (EEG) data to mimic wheelchair navigation. Five different models were trained on a pre-filtered dataset, taken from an open-source Kaggle repository, that was divided into fixed-length windows using a sliding-window technique. Each window contained statistical measurements, FFT coefficients for different frequency bands, and a label identifying the activity carried out during that window. The XGBoost model outperformed the other models, including CatBoost, GRU, and SVC, with an accuracy of 60%. The CatBoost model, with a large gap between training and testing accuracy, showed signs of overfitting. The best-performing model, together with SVC, was implemented in a tkinter GUI in which wheelchair movement could be simulated in various directions, and a Raspberry Pi-powered wheelchair system for the brain-computer interface is proposed.
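The abstract does not give the exact window length, band edges, or model settings, so the following is only a minimal sketch of the described pipeline under assumed values: fixed-length windows are slid over a single-channel recording, each window is reduced to simple statistics plus per-band FFT power, and an XGBoost classifier is trained on the resulting table. The sampling rate, window size, bands, and synthetic stand-in data are illustrative assumptions, not the authors' configuration.

```python
# Sketch of the sliding-window feature extraction + XGBoost step described above.
# All numeric settings below are assumptions for illustration.
import numpy as np
from numpy.fft import rfft, rfftfreq
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

FS = 128                      # assumed sampling rate (Hz)
WIN = 2 * FS                  # assumed 2 s windows
STEP = FS                     # assumed 50% overlap
BANDS = {"delta": (1, 4), "theta": (4, 8),
         "alpha": (8, 13), "beta": (13, 30)}

def window_features(sig):
    """Simple statistics plus per-band FFT power for one single-channel window."""
    spec = np.abs(rfft(sig)) ** 2
    freqs = rfftfreq(len(sig), d=1.0 / FS)
    feats = [sig.mean(), sig.std(), sig.min(), sig.max()]
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(spec[mask].mean())
    return np.array(feats)

def make_dataset(eeg, labels):
    """Slide a fixed-length window over the recording and label each window
    with the activity at its last sample."""
    X, y = [], []
    for start in range(0, len(eeg) - WIN, STEP):
        X.append(window_features(eeg[start:start + WIN]))
        y.append(labels[start + WIN - 1])
    return np.array(X), np.array(y)

if __name__ == "__main__":
    # Synthetic stand-in for the Kaggle EEG recording (single channel).
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(60 * FS)
    labels = np.repeat(np.arange(4), 15 * FS)   # four movement classes

    X, y = make_dataset(eeg, labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

In the proposed system, the predicted class for each incoming window would then be mapped to a direction command, either drawn in the tkinter simulation or sent to the motors of the Raspberry Pi-powered chair.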
Related papers
- Helpful DoggyBot: Open-World Object Fetching using Legged Robots and Vision-Language Models [63.89598561397856]
We present a system for quadrupedal mobile manipulation in indoor environments.
It uses a front-mounted gripper for object manipulation and a low-level controller trained in simulation using egocentric depth for agile skills.
We evaluate our system in two unseen environments without any real-world data collection or training.
arXiv Detail & Related papers (2024-09-30T20:58:38Z)
- Battle of the Backbones: A Large-Scale Comparison of Pretrained Models across Computer Vision Tasks [139.3768582233067]
Battle of the Backbones (BoB) is a benchmarking tool for neural network based computer vision systems.
We find that vision transformers (ViTs) and self-supervised learning (SSL) are increasingly popular.
In apples-to-apples comparisons on the same architectures and similarly sized pretraining datasets, we find that SSL backbones are highly competitive.
arXiv Detail & Related papers (2023-10-30T18:23:58Z)
- Sequential Best-Arm Identification with Application to Brain-Computer Interface [34.87975833920409]
A brain-computer interface (BCI) is a technology that enables direct communication between the brain and an external device or computer system.
An electroencephalogram (EEG) and event-related potential (ERP)-based speller system is a type of BCI that allows users to spell words without using a physical keyboard.
We propose a sequential top-two Thompson sampling (STTS) algorithm under the fixed-confidence setting and the fixed-budget setting.
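As a rough illustration of the top-two idea (not the paper's exact STTS procedure), the sketch below runs Bernoulli best-arm identification with a fixed budget: a posterior sample picks a leader, a capped re-sampling step picks a challenger, and a coin flip with an assumed parameter beta = 0.5 decides which of the two is pulled. Priors, budget, and the recommendation rule are illustrative assumptions.

```python
# Top-two Thompson sampling sketch for Bernoulli best-arm identification
# (fixed budget); all settings are illustrative, not the paper's algorithm.
import numpy as np

def top_two_thompson(true_means, budget=2000, beta=0.5, seed=0):
    rng = np.random.default_rng(seed)
    k = len(true_means)
    wins = np.ones(k)                            # Beta(1, 1) priors per arm
    losses = np.ones(k)
    for _ in range(budget):
        theta = rng.beta(wins, losses)           # posterior sample per arm
        leader = int(np.argmax(theta))
        challenger = int(np.argsort(theta)[-2])  # fallback: second-best draw
        for _ in range(100):                     # capped re-sampling for the challenger
            cand = int(np.argmax(rng.beta(wins, losses)))
            if cand != leader:
                challenger = cand
                break
        arm = leader if rng.random() < beta else challenger
        reward = rng.random() < true_means[arm]
        wins[arm] += reward
        losses[arm] += 1 - reward
    return int(np.argmax(wins / (wins + losses)))   # recommend the empirical best

print(top_two_thompson([0.30, 0.45, 0.50, 0.65]))   # expected output: 3
```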
arXiv Detail & Related papers (2023-05-17T18:49:44Z)
- FastRLAP: A System for Learning High-Speed Driving via Deep RL and Autonomous Practicing [71.76084256567599]
We present a system that enables an autonomous small-scale RC car to drive aggressively from visual observations using reinforcement learning (RL).
Our system, FastRLAP (faster lap), trains autonomously in the real world, without human interventions, and without requiring any simulation or expert demonstrations.
The resulting policies exhibit emergent aggressive driving skills, such as timing braking and acceleration around turns and avoiding areas which impede the robot's motion, approaching the performance of a human driver using a similar first-person interface over the course of training.
arXiv Detail & Related papers (2023-04-19T17:33:47Z)
- Hybrid Paradigm-based Brain-Computer Interface for Robotic Arm Control [0.9176056742068814]
A brain-computer interface (BCI) uses brain signals to communicate with external devices without physical control.
We propose a knowledge distillation-based framework to manipulate a robotic arm through hybrid paradigm-induced EEG signals for practical use.
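The paper's specific distillation framework is not described in this summary; the snippet below is only a generic soft-label knowledge-distillation loss (student logits pulled toward a teacher's temperature-softened outputs), with the temperature, weighting, and toy networks assumed for illustration.

```python
# Generic knowledge-distillation loss sketch; not the paper's framework.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend KL divergence to the teacher's softened outputs with the
    ordinary cross-entropy on the true labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: 8 EEG feature vectors, 4 command classes (hypothetical sizes).
student = torch.nn.Linear(32, 4)
teacher = torch.nn.Linear(32, 4)
x = torch.randn(8, 32)
y = torch.randint(0, 4, (8,))
loss = distillation_loss(student(x), teacher(x).detach(), y)
loss.backward()
print(float(loss))
```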
arXiv Detail & Related papers (2022-12-14T08:13:10Z)
- FingerFlex: Inferring Finger Trajectories from ECoG signals [68.8204255655161]
The FingerFlex model is a convolutional encoder-decoder architecture adapted for finger movement regression on electrocorticographic (ECoG) brain data.
State-of-the-art performance was achieved on a publicly available BCI competition IV dataset 4 with a correlation coefficient between true and predicted trajectories up to 0.74.
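As a hedged illustration of the encoder-decoder idea (not the actual FingerFlex architecture), the sketch below maps a multichannel signal segment to one continuous trajectory per finger with 1-D convolutions and transposed convolutions; channel counts, layer sizes, and the MSE loss are assumptions.

```python
# Toy 1-D convolutional encoder-decoder for trajectory regression from
# multichannel brain signals; sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ConvEncoderDecoder(nn.Module):
    def __init__(self, in_ch=64, out_ch=5, hidden=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(in_ch, hidden, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(hidden, hidden, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(hidden, out_ch, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):              # x: (batch, channels, time)
        return self.decoder(self.encoder(x))

model = ConvEncoderDecoder()
ecog = torch.randn(2, 64, 256)         # 2 segments, 64 channels, 256 samples
pred = model(ecog)                     # (2, 5, 256): one trajectory per finger
loss = nn.functional.mse_loss(pred, torch.randn_like(pred))
print(pred.shape, float(loss))
```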
arXiv Detail & Related papers (2022-10-23T16:26:01Z)
- Toward smart composites: small-scale, untethered prediction and control for soft sensor/actuator systems [0.6465251961564604]
We present a suite of algorithms and tools for model-predictive control of sensor/actuator systems with embedded microcontroller units (MCUs).
These MCUs can be colocated with sensors and actuators, enabling a new class of smart composites capable of autonomous behavior.
Online Newton-Raphson optimization solves for the control input.
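A minimal sketch of that idea, assuming scalar linear dynamics and a quadratic one-step cost (not the paper's composite model): the control input is found by Newton-Raphson iterations on the cost gradient, which is cheap enough to run online on an MCU.

```python
# Newton-Raphson solve for a one-step MPC input; dynamics and cost are
# illustrative assumptions, not the paper's model.
def newton_control(x, x_ref, a=0.9, b=0.5, r=0.1, iters=5, u0=0.0):
    """Minimise J(u) = (a*x + b*u - x_ref)**2 + r*u**2 over the input u."""
    u = u0
    for _ in range(iters):
        err = a * x + b * u - x_ref
        grad = 2 * b * err + 2 * r * u        # dJ/du
        hess = 2 * b * b + 2 * r              # d2J/du2 (constant, positive)
        u -= grad / hess                      # Newton-Raphson update
    return u

# Re-solve at every step to drive the state toward the reference.
x, x_ref = 0.0, 1.0
for t in range(10):
    u = newton_control(x, x_ref)
    x = 0.9 * x + 0.5 * u
    print(f"t={t}  u={u:+.3f}  x={x:.3f}")
```

Because the assumed cost is quadratic, a single Newton step already solves it exactly; the iteration loop only matters for less trivial models.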
arXiv Detail & Related papers (2022-05-22T22:19:09Z)
- Bayesian Optimization and Deep Learning for steering wheel angle prediction [58.720142291102135]
This work aims to obtain an accurate model for the prediction of the steering angle in an automated driving system.
Bayesian Optimization (BO) was able to identify, within a limited number of trials, a model (namely BOST-LSTM) that proved the most accurate when compared to classical end-to-end driving models.
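A rough sketch of hyperparameter search in that spirit, assuming scikit-optimize's gp_minimize, a toy LSTM, and synthetic steering sequences (none of which are the paper's actual setup):

```python
# Bayesian optimisation over a toy LSTM's hyperparameters; everything here
# (search space, data, training budget) is an illustrative assumption.
import numpy as np
import torch
import torch.nn as nn
from skopt import gp_minimize
from skopt.space import Integer, Real

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20, 4)).astype(np.float32)        # 200 sequences, 20 steps, 4 features
y = X[:, :, 0].mean(axis=1, keepdims=True).astype(np.float32)   # toy steering-angle target

class SteeringLSTM(nn.Module):
    def __init__(self, hidden):
        super().__init__()
        self.lstm = nn.LSTM(input_size=4, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])             # predict the angle from the last step

def objective(params):
    hidden, lr = int(params[0]), float(params[1])
    torch.manual_seed(0)
    model = SteeringLSTM(hidden)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    xb, yb = torch.from_numpy(X[:150]), torch.from_numpy(y[:150])
    for _ in range(30):                           # short, fixed training budget
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(xb), yb)
        loss.backward()
        opt.step()
    with torch.no_grad():
        val = nn.functional.mse_loss(model(torch.from_numpy(X[150:])),
                                     torch.from_numpy(y[150:]))
    return float(val)                             # BO minimises validation MSE

result = gp_minimize(objective,
                     [Integer(8, 64, name="hidden_size"),
                      Real(1e-3, 1e-1, prior="log-uniform", name="lr")],
                     n_calls=15, random_state=0)
print("best (hidden_size, lr):", result.x, "val MSE:", result.fun)
```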
arXiv Detail & Related papers (2021-10-22T15:25:14Z)
- Wheelchair automation by a hybrid BCI system using SSVEP and eye blinks [1.1099588962062936]
The prototype is based on a combined mechanism of steady-state visually evoked potential and eye blinks.
The prototype can be used efficiently in a home environment without causing any discomfort to the user.
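A minimal sketch of how such a hybrid rule could look, assuming fixed stimulus frequencies, one occipital and one frontal channel, and a "blink to confirm" policy; none of these details are taken from the paper.

```python
# Hybrid SSVEP + eye-blink decision sketch; frequencies, threshold, and the
# confirm logic are illustrative assumptions.
import numpy as np

FS = 256                                    # assumed sampling rate (Hz)
STIM_FREQS = {8.0: "left", 10.0: "right", 12.0: "forward", 15.0: "stop"}

def ssvep_command(occipital, fs=FS):
    """Pick the flicker frequency with the strongest spectral peak."""
    spec = np.abs(np.fft.rfft(occipital)) ** 2
    freqs = np.fft.rfftfreq(len(occipital), 1.0 / fs)
    powers = {f: spec[np.argmin(np.abs(freqs - f))] for f in STIM_FREQS}
    return STIM_FREQS[max(powers, key=powers.get)]

def blink_detected(frontal, threshold=80e-6):
    """Crude blink detector: large-amplitude deflection on a frontal channel."""
    return np.ptp(frontal) > threshold

# Synthetic 2-second trial: a 12 Hz SSVEP response plus a blink artifact.
t = np.arange(2 * FS) / FS
occipital = 5e-6 * np.sin(2 * np.pi * 12.0 * t) + 1e-6 * np.random.randn(len(t))
frontal = np.zeros_like(t)
frontal[100:130] = 100e-6                   # blink-like spike

cmd = ssvep_command(occipital)
if blink_detected(frontal):
    print("confirmed command:", cmd)        # e.g. drive the chair "forward"
```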
arXiv Detail & Related papers (2021-06-10T08:02:31Z)
- BeCAPTCHA-Mouse: Synthetic Mouse Trajectories and Improved Bot Detection [78.11535724645702]
We present BeCAPTCHA-Mouse, a bot detector based on a neuromotor model of mouse dynamics.
BeCAPTCHA-Mouse is able to detect highly realistic bot trajectories with 93% accuracy on average using only one mouse trajectory.
arXiv Detail & Related papers (2020-05-02T17:40:49Z)
- Brain-based control of car infotainment [0.0]
We present a custom portable EEG-based Brain-Computer Interface (BCI) that exploits Event-Related Potentials (ERPs) induced with an oddball experimental paradigm to control the infotainment menu of a car.
Subject-specific models were trained with different machine learning approaches to classify EEG responses to target and non-target stimuli.
No statistical differences were observed between the classification accuracies (CAs) for the in-lab and in-car training sets, nor between the EEG responses in these conditions.
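A minimal sketch of the target versus non-target classification step, assuming synthetic single-channel epochs with a P300-like deflection and an LDA classifier; epoch length, data, and classifier choice are illustrative, not the paper's models.

```python
# Toy ERP (oddball) classification: cut epochs after each stimulus and
# classify target vs non-target with LDA; all values are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 256
EPOCH = int(0.8 * FS)                     # 0 to 800 ms after each stimulus

rng = np.random.default_rng(0)
n_trials = 200
labels = rng.integers(0, 2, n_trials)     # 1 = target (oddball), 0 = non-target

# Synthetic single-channel epochs: targets get a P300-like bump around 300 ms.
t = np.arange(EPOCH) / FS
p300 = 4e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
epochs = 2e-6 * rng.standard_normal((n_trials, EPOCH)) + np.outer(labels, p300)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, epochs, labels, cv=5)
print("mean classification accuracy:", scores.mean())
```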
arXiv Detail & Related papers (2020-04-24T20:32:05Z)