Enhancing Neuroprosthetic Control Using CNN-LSTM Models: A Simulation Study with EEG-Based Motor Imagery

Document Type: Refereed research paper.

Author

General Systems Engineering, Faculty of Engineering, October University for Modern Sciences and Arts.

Abstract

The development of intuitive and responsive neuroprosthetic systems remains a critical challenge in assistive technologies, particularly in decoding neural signals to enable precise and adaptive motor control. This study addresses the problem of translating EEG-based motor imagery into effective neuroprosthetic control, overcoming challenges such as limited data, overfitting in predictive models, and practical constraints in robotic actuation.
A CNN-LSTM hybrid model was developed to classify motor imagery tasks from EEG signals. Data augmentation and regularization techniques improved the model's performance, yielding a test accuracy of 93.5% with balanced precision and recall across motor imagery tasks. To validate its practical application, a PyBullet-based simulation demonstrated successful control of a robotic gripper, in which the model's predictions were translated into accurate "open" and "close" actions. The gripper joints executed these actions with high precision, demonstrating the system's potential for real-time neuroprosthetic applications. However, limitations such as the small dataset and simulation-specific constraints underscore the need for further optimization.
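The pipeline described above (a CNN front end for feature extraction, an LSTM for temporal dynamics, dropout for regularization, and a mapping from class predictions to gripper commands) can be sketched as follows. This is a minimal illustrative sketch in PyTorch: the layer sizes, channel count, and the class-to-action mapping are assumptions for demonstration, not the paper's exact architecture or dataset configuration.

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """Illustrative CNN-LSTM for EEG motor-imagery classification.

    Input shape: (batch, n_channels, n_samples). All sizes are assumed
    values, not those reported in the study.
    """
    def __init__(self, n_channels=22, n_classes=2):
        super().__init__()
        # 1-D convolution extracts local spatio-temporal features
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Dropout(0.3),  # regularization against overfitting
        )
        # LSTM models temporal dependencies across the feature sequence
        self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, n_classes)

    def forward(self, x):
        f = self.cnn(x)            # (batch, 32, n_samples // 2)
        f = f.permute(0, 2, 1)     # (batch, time, features) for the LSTM
        _, (h, _) = self.lstm(f)   # final hidden state summarizes the trial
        return self.fc(h[-1])      # class logits

def logits_to_gripper_action(logits):
    """Map the predicted class to a gripper command (class 0 -> 'open').

    In the simulation, this string would select the joint targets sent
    to the PyBullet gripper; the mapping itself is an assumption here.
    """
    return "open" if logits.argmax(dim=-1).item() == 0 else "close"

model = CNNLSTM()
model.eval()
eeg = torch.randn(1, 22, 256)  # one synthetic trial: 22 channels, 256 samples
action = logits_to_gripper_action(model(eeg))
print(action)
```

In a full system, `logits_to_gripper_action` would be called on each decoded EEG window and the resulting command forwarded to the simulated gripper's joint controllers.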
This study provides a robust proof-of-concept for integrating deep learning with brain-computer interfaces to achieve adaptive, reliable, and real-time neuroprosthetic control. By addressing these key challenges, the proposed framework bridges the gap between neural signal decoding and physical actuation, offering a pathway toward advanced and responsive neuroprosthetic systems.

Keywords