As illustrated in Figure 13, a specific code such as
"111" is transmitted during serial communication; this
code corresponds to the "right" mental command, and
on receiving it the arm moves to the right by a
35-degree angle. The same procedure was applied for
the "drop" mental command.
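The command-to-code mapping described above can be sketched as a small lookup on the Jetson Nano side. Only the "right" → "111" code and the 35-degree step are given in the text; the code for "drop", the 0–180 degree joint range, and all function names are illustrative assumptions.

```python
# Sketch of the serial command protocol: each decoded mental command
# maps to a fixed code that is written over the serial link to the arm.
COMMAND_CODES = {
    "right": "111",  # from the paper: moves the arm right by 35 degrees
    "drop": "000",   # assumed code for the "drop" mental command
}

RIGHT_STEP_DEG = 35  # rotation applied when "right" is decoded


def encode_command(mental_command: str) -> bytes:
    """Translate a decoded mental command into the serial payload."""
    try:
        code = COMMAND_CODES[mental_command]
    except KeyError:
        raise ValueError(f"unknown mental command: {mental_command!r}")
    return code.encode("ascii")


def apply_command(current_angle_deg: float, mental_command: str) -> float:
    """Track the base-joint angle as commands arrive (clamped to 0-180)."""
    if mental_command == "right":
        current_angle_deg += RIGHT_STEP_DEG
    return max(0.0, min(180.0, current_angle_deg))
```

In a deployment, the payload would typically be written to the arm over a pyserial connection, e.g. `serial.Serial("/dev/ttyUSB0", 9600).write(encode_command("right"))` (port and baud rate assumed).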
5 CONCLUSION
This paper presents the design of a brain-driven
robotic arm as an advanced assistive device for
individuals with upper-limb amputations. The
convergence of advances in brain-computer interfaces
(BCIs) and robotics has opened a new era of precision
and control for robotic arms, addressing the pressing
need for assistive devices that offer greater
independence and functionality to individuals with
disabilities. The project aims to enhance amputees'
independence by providing greater control and
functionality through BCI technology and
electroencephalogram (EEG) signals. The proposed
experimental design combines the 6-DOF Yahboom
DOFBOT robotic arm kit with the 14-channel EPOC X
EEG headset. The system is controlled by an NVIDIA
Jetson Nano and programmed in Python, employing
linear discriminant analysis (LDA) for the
classification task. Because traditional devices, often
limited by their demand for substantial physical effort
and their lack of versatility, fall short of meeting the
daily needs of these individuals, the proposed robotic
arm emerges as a vital solution, promising to transform
the support available to individuals with severe motor
disabilities, including limb loss. Future work will focus
on advances in
feature extraction techniques for EEG signals to
enhance control accuracy. Specifically, exploring
advanced methods such as time-frequency analysis and
deep learning-based feature extraction holds
significant potential for improving the discrimination
of relevant brain activity patterns.
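The conclusion names an LDA method for the classification task; in EEG-based BCI work, LDA conventionally denotes linear discriminant analysis. A minimal NumPy sketch of a two-class Fisher discriminant on synthetic band-power-style features follows; the feature dimension, class labels ("right" vs. "drop"), and all data are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)


def fit_lda(X0, X1):
    """Fisher LDA: w = S_w^{-1} (mu1 - mu0), with the decision
    threshold at the midpoint of the projected class means."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class covariance, lightly regularized for stability.
    S = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    S += 1e-6 * np.eye(S.shape[0])
    w = np.linalg.solve(S, mu1 - mu0)
    threshold = 0.5 * ((X0 @ w).mean() + (X1 @ w).mean())
    return w, threshold


def predict(X, w, threshold):
    """Return 1 (second class) where the projection exceeds the threshold."""
    return (X @ w > threshold).astype(int)


# Synthetic 4-dimensional features standing in for two mental commands.
X_right = rng.normal(loc=0.0, scale=1.0, size=(100, 4))
X_drop = rng.normal(loc=2.0, scale=1.0, size=(100, 4))

w, thr = fit_lda(X_right, X_drop)
X_all = np.vstack([X_right, X_drop])
y_true = np.array([0] * 100 + [1] * 100)
accuracy = (predict(X_all, w, thr) == y_true).mean()
```

Projecting onto a single discriminant direction keeps the classifier cheap enough to run in real time on an embedded board such as the Jetson Nano, which is one reason LDA remains a common baseline for EEG command decoding.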