Abstract

More than one billion people face disabilities worldwide, according to the World Health Organization (WHO). In Sri Lanka, thousands of people suffer from a variety of disabilities, especially hand disabilities, due to the country’s civil war. The Ministry of Health of Sri Lanka reports that by 2025, the number of people with disabilities in Sri Lanka will grow by 24.2%. In the field of robotics, new technologies are now being built to make the lives of handicapped people simpler and more effective. The aim of this research is to develop a 3-finger anatomical robot hand model for handicapped people and to control the flexion and extension of the robot hand using motor imagery. Eight EEG electrodes were used to extract EEG signals from the primary motor cortex. Data collection and testing were performed over a 42 s timespan. According to the test results, eight EEG electrodes were sufficient to acquire the motor imagery for flexion and extension finger movements. The overall accuracy of the experiments was 89.34% (mean = 22.32), with a precision of 0.894. We also observed that the proposed design provided promising results for performing grab, hold, and release tasks for hand-disabled persons.

1. Introduction

Due to population aging and the country’s civil war, the prevalence of disability in Sri Lanka rose between 1981 and 2001 [1, 2]. In the 2001 census, it was estimated that 0.3 million people had a disability. Disabilities related to vision, hearing, speech, hands, legs, and other physical and mental health conditions have the highest prevalence estimates per 10,000 population [3]. According to recent studies, 7% of Sri Lanka’s total population (approximately 1.4 million people) suffers from some form of disability [4]. This research primarily targets people with hand disabilities.

Robotics technology is rapidly evolving in order to make people’s lives easier and more efficient. In this field, various machines and robot designs are manufactured to assist handicapped people who suffer from disabilities such as blindness, broken legs or arms, and dislocated body parts. Many robotic arms are being designed for the purpose of providing care to physically disabled people [5], and some have already been commercialized [6]. Recent advancements in neural prosthetics can help disabled people regain control of their motor functions and speech [7, 8].

The brain-computer interface (BCI) is a multidisciplinary domain involving subjects such as neuroscience, digital signal processing, and machine learning [9]. BCI is a rapidly growing technology that assists disabled people [10]. BCIs capture brain signals with scalp electrodes and are appealing for controlling prosthetic devices such as robot arms. The most traditional BCI paradigms include motor imagery, P300, and steady-state visual evoked potentials (SSVEP) [11, 12]. Biosignal-based control systems are the next step toward achieving higher accuracy and can be classified in different ways [13]. Biosignals, or bioelectrical time signals, are biomedical signals that represent the collective electrical and mechanical activity of organs in the human body. The best-known bioelectrical signals are electroencephalography (EEG), electrocardiography (ECG), electromyography (EMG), mechanomyography (MMG), electrooculography (EOG), galvanic skin response (GSR), and magnetoencephalography (MEG).

Most of the robot designs introduced are not based on the anatomical behavior of the human body. It is a challenge to design a robot hand that mimics the anatomy of the human hand. This paper provides a scientific approach to the development of an anatomical robot hand model for hand-disabled people. The objectives of this research are as follows:
(1) To find a possible solution for the development of a 3-finger robot hand based on the anatomy of the human hand.
(2) To acquire and process the EEG signals related to hand movements.
(3) To control the robot hand model according to the motor imagery.

3D printing technology was used for the physical development of the robot hand model. For the finger design, the anatomy of the thumb, index, and middle fingers was considered. The finger flexion and extension movements were inspired by the extensor tendons of the human hand. The theoretical background of the research is presented in Section 2. Section 3 describes the related works in the BCI field. Section 4 explains the design and implementation of the proposed system. Section 5 demonstrates the test results of the proposed system, and Section 6 discusses the outcomes and limitations of the research. Finally, Section 7 concludes the research and proposes some future works.

2. Theoretical Background

2.1. Anatomy of the Human Hand

The human hand (Figure 1) has a complex anatomical structure consisting of bones, muscles, tendons, skin, and the complex relationships between them [15, 16]. The human hand is composed of 27 bones, arranged in five serial kinematic chains to form the fingers. The fingers are numbered as follows:
(1) Thumb
(2) Index finger
(3) Middle finger
(4) Ring finger
(5) Little finger

Each finger (2–5) consists of a metacarpal bone located in the hand and three phalanges named the proximal, medial, and distal phalanges (in order from the finger base to the fingertip). The thumb consists of only proximal and distal phalanges; it does not have a medial phalange. The remaining eight hand bones are the carpals, which are located in the wrist [17]. Each joint is named after the bones it links. In the fingers, the two interphalangeal (IP) joints are distinguished by prefixes, and together with the knuckle joint the finger joints are as follows:
(1) DIP (distal interphalangeal)
(2) PIP (proximal interphalangeal)
(3) MCP (metacarpophalangeal)

Finger movements originate from pyramidal and nonpyramidal cells in the motor cortex. Pyramidal cells, the major output neurons, send long axons down the spinal cord. Primary motor cortex neurons fire 5–100 ms before the onset of a movement [18]. Finger movements of the human hand [19] are composed of three actions as follows:
(1) Flexion finger movement
(2) Extension finger movement
(3) Idle finger movement

The anatomy of the human hand was considered for the development of the thumb, index, and middle fingers of the robot hand model. Natural finger flexion and extension are performed by linearly coupled movements among the metacarpophalangeal (MCP), proximal interphalangeal (PIP), and distal interphalangeal (DIP) joints [20, 21]. Table 1 depicts the ranges of motion angles for the DIP, PIP, and MCP joints of the human hand.

The first objective of this research was the development of a robotic hand model that functions like a human hand. The anatomical behavior explained here guided the development of the finger (thumb, index, and middle) structures. The working principle of the extensor tendons was studied to find a solution for the working mechanism of flexion and extension of the robot fingers. The bending angles of each robot finger joint were derived from the real bending angles of the human fingers. The motor control method was therefore developed according to the bending of each joint.
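To make the joint-coupling idea concrete, the following is a minimal planar forward-kinematics sketch for one finger; it is not the authors' code, and the phalanx lengths and linear coupling ratios between the MCP, PIP, and DIP joints are illustrative assumptions, not values from the paper or Table 1.

```python
import numpy as np

def fingertip_position(mcp_deg, coupling_pip=1.0, coupling_dip=0.67,
                       lengths=(45.0, 25.0, 18.0)):
    """Planar forward kinematics of a three-joint finger.

    mcp_deg      : flexion angle of the MCP joint in degrees
    coupling_*   : assumed linear coupling ratios (PIP/MCP and DIP/PIP)
    lengths      : assumed proximal, medial, distal phalanx lengths in mm
    Returns the (x, y) fingertip position in the finger plane.
    """
    lengths = np.asarray(lengths)
    mcp = np.radians(mcp_deg)
    pip = coupling_pip * mcp                 # PIP flexes in proportion to MCP
    dip = coupling_dip * pip                 # DIP flexes in proportion to PIP
    angles = np.cumsum([mcp, pip, dip])      # absolute orientation of each phalanx
    x = np.sum(lengths * np.cos(angles))
    y = -np.sum(lengths * np.sin(angles))    # flexion bends toward the palm
    return x, y

print(fingertip_position(0))     # fully extended finger
print(fingertip_position(45))    # partially flexed finger
```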

2.2. Electroencephalography Signals and International 10–20 Electrode Placement Protocol

The brain-computer interface (BCI) is a system that facilitates communication between the brain and a machine [8]. A BCI device is capable of recording and interpreting brain signals and generating corresponding commands on the connected computer to perform its purpose. The functions of a typical BCI system are based on the sequential execution of several procedures: signal acquisition, preprocessing, feature extraction, classification, translation, and feedback to the controller. In this research, we focused primarily on the acquisition of EEG signals within the BCI signal acquisition process (EEG-BCI).

Electroencephalography (EEG) is a method used to measure the electrical activity of the human brain. EEG uses surface electrodes to measure the electrical signals of the human brain [22]. The International Federation of Clinical Neurophysiology (IFCN) adopted a standard method for EEG electrode placement known as the 10–20 electrode placement protocol [23, 24]. The international 10–20 protocol standardizes the physical placements and designations of 21 electrodes on the human scalp. Figure 2 illustrates the placement locations of the electrodes from the side view and plan view. Using the reference points on the skull at the nasion, the preauricular points, and the inion, the head is divided into proportional positions to provide adequate coverage of all regions of the human brain [26].

The most well-known rhythmic activity generated by the human brain is the alpha waves (α) (8–13 Hz), which are generated during wakeful relaxation [27]. The theta rhythms (θ) (4–7 Hz) are associated with memory processing when they appear in the frontal cortex [28] and with spatial navigation when they appear in the parietal cortex. Previous studies discovered that neurophysiological phenomena called event-related desynchronization (ERD) and synchronization (ERS) are detectable from EEG signals when motor imagery is performed [29]. ERD and ERS are also frequency band-specific [30] and can be observed in the mu rhythms (µ) (8–12 Hz) or beta rhythms (ß) (13–30 Hz) of the EEG signals [31]. The amplitude of the EEG rhythms is about 100 µV when measured on the scalp and about 1–2 mV when measured from the surface of the human brain. Motor imagery features [32] of the EEG signals appear in the frequency range of 6–33 Hz. Artifacts caused by transmission lines lie within the 50–60 Hz range, and eye artifacts lie within the 2–5 Hz range of the EEG data.

Preprocessing minimizes signal noise and applies filtering and other measures to eliminate errors caused by endogenous sources (eye, muscle, and heart) and exogenous sources (power-line coupling and impedance mismatch) [33]. Preprocessing is typically achieved by low-pass, high-pass, band-pass, or notch filtering. The use of such filters may, however, exclude useful elements of EEG signals that share a frequency band with the artifacts.

The second objective of this research was the acquisition and processing of EEG signals related to finger movements. The EEG acquisition and electrode positioning techniques discussed here were used to meet the research objectives. Since hand and finger movements are related to the primary motor cortex of the human brain, an 8-electrode EEG helmet was designed to cover the primary motor cortex region. Electrode placements (FC3, FC4, C1, C2, C3, C4, Cz, and CPz) were chosen according to the international 10–20 electrode placement protocol. The EEG preprocessing performed in the design section (Section 4) of this paper was carried out based on the theoretical background of the signal behavior discussed here.

3. Related Works

Using noninvasive EEG, Xiao and Ding [34] have evaluated multiple movement-related features within the same task of distinguishing individual fingers of a single hand. They used 128 EEG channels and showed that decoding individual fingers with noninvasive EEG has the potential to increase the number of control features, allowing for the advancement of more sophisticated noninvasive BCI applications.

Javed et al. [35] have proposed a new approach for classifying four-finger motions of the right hand based on EEG data. They used a 14-channel electrode headset to acquire the EEG signals. The gathered EEG signals were initially filtered to preserve the alpha and beta bands, which provide the most detail about movement.

A BCI system suggested by Gannouni et al. [36] has distinguished differences between the five individual fingers. As a result, a multiclassification problem based on an ensemble of one-class classifiers was implemented, with each classifier predicting the intention to move one finger.

Alazrai et al. [37] have suggested an EEG-based BCI method for detecting finger motions, including the flexion and extension movements of the index, middle, ring, and little fingers, as well as four thumb-related movements, including thumb adduction, thumb abduction, thumb flexion, and thumb extension.

Using electroencephalography (EEG), Ketenci and Kayikcioglu [38] have investigated the effect of the theta brainwave on movement identification in four right-handed participants who performed extensions with their right-hand fingers. Muscle signals were used to derive movement and rest epochs from a continuous EEG recording. The common average and Laplacian reference methods were used for referencing, and the selected channels were located over the sensorimotor region. The presence of the theta band in the frequency domain was shown using the power spectral density function.

We reviewed the most recent research articles related to EEG and BCI through Google Scholar. Through this systematic review, we gained knowledge regarding the development of our robot hand model using BCI technologies. Table 2 depicts the recent studies and the number of electrodes used in each work.

4. Materials and Methods

The objectives of this research were to find a possible solution for the development of a three-finger robot hand model that works according to the anatomy of the human hand, to acquire and process the EEG signals related to hand movements, and to control the robot hand model according to the motor imagery. This section introduces the design and development work for each objective. Following the concepts explained in the theoretical background of this paper, we developed the EEG helmet as the first step.

4.1. Design and Development of EEG Helmet

The acquisition of the EEG signal relating to the mental execution of hand movements is associated with the primary motor cortex of the human brain. Electrodes therefore need to be positioned to cover the motor cortex region of the head. In this research, we used Ag/AgCl-coated 8-channel electrode caps (d = 10 mm). Experiments were carried out with eight EEG electrodes attached to a fabric helmet according to the international 10–20 electrode placement protocol at FC3, FC4, C1, C2, C3, C4, Cz, and CPz. Table 3 depicts the coordinates of the EEG electrodes on the human scalp according to the 10–20 electrode placement protocol, where θ is the inclination angle, φ is the azimuth angle, r is the radius of the subject’s head, x is positive toward the neck, y is positive toward the right ear, and z is positive toward the sky.

Equations (1)–(3) represent the x-, y-, and z-directions of the subject’s head, respectively. Figure 3 shows the physical view of the EEG electrode helmet worn on the subject’s head while performing the experiments.
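As an illustration of the coordinate conversion behind Table 3 and equations (1)–(3), the following sketch converts an electrode's inclination and azimuth angles into Cartesian coordinates using the common spherical-to-Cartesian formulas; the head radius and the exact assignment of the azimuth to the x/y axes are assumptions, not values taken from the paper.

```python
import numpy as np

def electrode_xyz(theta_deg, phi_deg, r=87.5):
    """Convert a 10-20 electrode position from spherical to Cartesian coordinates.

    theta_deg : inclination angle measured from the vertical (z) axis, degrees
    phi_deg   : azimuth angle in the horizontal plane, degrees
    r         : assumed head radius in mm (not taken from the paper)

    The z axis points toward the sky, as in the text; how the azimuth maps onto
    the neck (x) and right-ear (y) directions depends on the chosen convention,
    so treat this as an illustrative convention only.
    """
    theta, phi = np.radians(theta_deg), np.radians(phi_deg)
    x = r * np.sin(theta) * np.cos(phi)
    y = r * np.sin(theta) * np.sin(phi)
    z = r * np.cos(theta)
    return x, y, z

# Example: the vertex electrode Cz has zero inclination, so it lies on the z axis.
print(electrode_xyz(0, 0))   # -> (0.0, 0.0, 87.5)
```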

4.2. Hardware Selection for the EEG Acquisition

The hardware selection for the acquisition of EEG was a key objective in the design section. OpenBCI specializes in developing low-cost, high-quality biosensing hardware for brain-computer interfacing. An OpenBCI [41] printed circuit board (PCB) is equipped with sensors to detect and measure electrical activity in the brain (EEG), muscles (EMG), and heart (EKG). In this research, EEG signals were captured using the OpenBCI 8-channel Cyton biosensing module and transferred to the EEG data analyzing interface (EEGDAI) for signal processing and classification. Furthermore, EEGDAI analyzed the data according to the motor imagery, and control signals were passed to the Arduino microcontroller. For this research, an Arduino ATmega 2560 microcontroller was used [42, 43]. The next step was to design and develop the EEG data analyzing interface (EEGDAI).
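In this work the Cyton data are streamed into a MATLAB-based interface. Purely as an illustration, a minimal Python sketch of reading the eight Cyton EEG channels through the open-source BrainFlow API might look as follows; the serial port path and recording duration are assumptions.

```python
import time
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"        # assumed port of the Cyton USB dongle

board = BoardShim(BoardIds.CYTON_BOARD.value, params)
board.prepare_session()
board.start_stream()
time.sleep(5)                              # record for a few seconds (illustrative)
data = board.get_board_data()              # channels x samples array
board.stop_stream()
board.release_session()

# Keep only the eight EEG rows of the Cyton board
eeg_channels = BoardShim.get_eeg_channels(BoardIds.CYTON_BOARD.value)
eeg = data[eeg_channels, :]
print(eeg.shape)
```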

4.3. Development of the EEG Data Analyzing Interface (EEGDAI)

The EEG data analyzing interface (EEGDAI) was developed to acquire EEG signals and to filter and analyze the received signals. EEGDAI (Figure 4) was developed using the MATLAB GUI [44] application tool, and each filtering process was performed by implementing GUI callback functions. A popular tool for BCI system design is BCILAB [45], an open-source MATLAB toolbox and EEGLAB plugin developed by C. Kothe at the Swartz Center. BCILAB contains many EEG signal processing methods and includes a graphical user interface that aided the development of the EEGDAI system.

4.4. EEG Preprocessing and Classification for Motor Imagery

Signal artifacts [46] are a significant concern when collecting EEG data during the acquisition process. Artifacts are unwanted signals that originate from environmental noise, experimental error, and physiological sources [47]. Many techniques have been developed in both the time and frequency domains for correcting or removing artifacts from EEG rhythms [48, 49]. There are also physiological artifacts, which are bioelectrical signals from other parts of the human body, such as the heart, muscle activity, eye blinks, and eyeball movements, that are registered in the EEG rhythms [50, 51].

The surface Laplacian algorithm (SLA) was used for the spatial filtering process; the SLA smoothed the signal and reduced ocular artifacts, cardiac artifacts, and power-line interference. The features of motor imagery [32] in EEG signals appear in the 6–33 Hz frequency range. The mu (µ) (8–12 Hz) and beta (ß) (13–30 Hz) rhythms were used to distinguish EEG signals [52] related to motor imagery tasks (hand movements). Artifacts caused by transmission lines lay within the 50–60 Hz range, and eye artifacts in the 2–5 Hz range were found in the recorded EEG data. Therefore, the signals were band-pass filtered with a 6–35 Hz passband to eliminate the artifacts caused by transmission lines and eye movements. Figure 5 illustrates the flow of the signal processing and classification process of the proposed design.
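A minimal sketch of this preprocessing stage is given below; it assumes the 256 Hz sampling rate reported in the Results section and an arbitrary neighbor layout for the surface Laplacian, and it does not reproduce the actual MATLAB implementation of EEGDAI.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256.0   # sampling frequency reported in the Results section (Hz)

def bandpass_6_35(eeg, fs=FS, low=6.0, high=35.0, order=4):
    """Zero-phase Butterworth band-pass (6-35 Hz), suppressing eye artifacts
    (2-5 Hz) and power-line interference (50-60 Hz)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def small_laplacian(eeg, neighbors):
    """Simple surface Laplacian: subtract the mean of neighboring channels.

    eeg       : array of shape (n_channels, n_samples)
    neighbors : dict mapping a channel index to the indices of its neighbors
                (the layout used here is an assumption, not from the paper).
    """
    out = eeg.copy()
    for ch, nb in neighbors.items():
        out[ch] = eeg[ch] - eeg[nb].mean(axis=0)
    return out

# Example with random data standing in for an 8-channel recording
eeg = np.random.randn(8, int(10 * FS))
filtered = bandpass_6_35(eeg)
laplacian = small_laplacian(filtered, {2: [0, 4, 6], 3: [1, 5, 7]})
```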

After EEG acquisition was completed, the “Artifact remove” process was performed to eliminate the artifacts from the original source. “Band-pass” filtering process was used to filter the mu and beta rhythms that were received from the “Artifact remove” process. Then, mu (µ) and beta (ß) rhythms were sent to the “Feature extraction” process to extract and identify the hand movements. The classified EEG data from the “Feature extraction” process were passed to the decision controller, which was used to generate the control decisions for the robot hand prototype. The decision controller and the hand model were connected with the controlling unit of the robot hand.

One of the major objectives of this research was the acquisition and processing of EEG signals related to hand movements. The design and development of the EEG helmet, the hardware selection for EEG acquisition, the development of the EEG data analyzing interface, and the EEG preprocessing and classification for motor imagery were the steps taken to achieve this objective.

4.5. Development of the Robot Hand Model and Decision Controller

The main objectives of this section were to develop a robot hand model based on the anatomical behavior of the human hand and develop a controller to make decisions for manipulating the robot hand model. In the development of the robot hand model, the bending angles of each interphalangeal joint of the human hand and the operating theory of the extensor tendons were taken into account.

3D printing is a new wave of technological advancement in the fields of architecture, design, and manufacturing [53]. The MakerBot Replicator Z18 3D printer was used for the development of each part of the robot hand model. Each finger was developed by referring to the anatomy of the human hand. The robot hand model consisted of thumb, index, and middle fingers. The index and middle fingers were developed with DIP, PIP, and MCP joints. The thumb was constructed with the DIP and MCP joints. The working mechanism of the robot hand model was developed using threads and a mechanical system. Each speed-reduction motor (GA12-N20) was attached to a plastic thread by using a pulley. When the motors rotated, the threads were tensioned or released. A stainless steel bearing (6 mm) was used to overcome the friction between the finger joints (DIP and PIP). When the motors rotated clockwise, the fingers performed the flexion movement, and when the motors rotated anticlockwise, the fingers performed the extension movement. Figure 6 shows the architecture of the fingers of the proposed hand design.

In this research, plastic threads were used as extensor tendons, and speed-reduction motors were used for tensioning and releasing the plastic threads. Each joint was fitted with a bearing in order to minimize the friction between the joints. Figure 7 illustrates the prototype of the robot hand model that was developed. As shown in the figure, the plastic threads were routed through the robot’s fingers. Different rotation angles of the motors were chosen for the different bending angles of the robot hand. The specifications of the robot hand model are described in Table 4. According to the table, the bending angles of the robot fingers match the actual bending angles of the human fingers.

The next step was the development of the robot controller. Figure 8 illustrates the system development block diagram for the robot hand controller.

EEG electrodes connected to the OpenBCI module were used to acquire the EEG signals. After the signals were acquired, a feature extraction process was performed.

Feature extraction process: the common spatial pattern (CSP) technique, a well-known feature extraction technique, was used to extract appropriate features from the eight EEG signals. These features reflect the most significant energy at the related electrodes in the mu (µ) and beta (ß) bands, which are most likely to contain significant motor imagery information.
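The following is a minimal two-class CSP sketch (for example, flexion versus extension epochs) producing the standard log-variance features; it illustrates the technique rather than reproducing the authors' implementation, and the synthetic epoch shapes are assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_components=4):
    """Two-class common spatial patterns (CSP).

    trials_a, trials_b : arrays of shape (n_trials, n_channels, n_samples)
    Returns spatial filters of shape (n_components, n_channels).
    """
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)  # channel covariance

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenvalue problem: directions maximizing class-A variance
    vals, vecs = eigh(ca, ca + cb)
    vecs = vecs[:, np.argsort(vals)[::-1]]
    half = n_components // 2
    picks = np.r_[0:half, -half:0]          # filters from both ends of the spectrum
    return vecs[:, picks].T

def csp_features(trial, filters):
    """Log-variance features of one trial after spatial filtering."""
    projected = filters @ trial
    var = projected.var(axis=1)
    return np.log(var / var.sum())

# Example with synthetic epochs: 20 trials per class, 8 channels, 512 samples
rng = np.random.default_rng(0)
flexion = rng.standard_normal((20, 8, 512))
extension = rng.standard_normal((20, 8, 512))
W = csp_filters(flexion, extension)
features = np.array([csp_features(t, W) for t in flexion])
print(features.shape)   # (20, 4)
```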

Classification process: the support vector machine (SVM) is a classic approach for pattern recognition in BCI systems that uses an optimal discriminant hyperplane to separate classes, and it was used to classify four types of patterns in this research. We used the OpenVibe Classifier Trainer for the classification process.
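The classification in this work was carried out with the OpenVibe Classifier Trainer; as an illustrative stand-in only, an SVM over CSP log-variance features could be trained with scikit-learn as sketched below (the data and labels here are synthetic).

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: one CSP log-variance feature vector per epoch, y: labelled activity.
# The labels mirror the test activities described in the paper; the data is synthetic.
X = np.random.randn(120, 4)
y = np.random.choice(["idle", "flexion", "hold", "extension"], size=120)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```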

Decision controlling process: the robot controller was composed of an Arduino ATmega 2560 microcontroller and L298N motor drivers. The direction of motor rotation was controlled using the IN1, IN2, IN3, and IN4 pins of the motor driver unit. The duty cycle of each motor was adjusted by setting the PWM (pulse-width modulation). Table 5 depicts the truth table for driving a motor in the clockwise and anticlockwise directions. Three motors, driven according to this truth table, were used to manipulate the fingers.
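A hypothetical sketch of the decision-controlling logic is given below, expressing the truth-table behaviour described above as a mapping from the classified activity to L298N input states for one motor; the HIGH/LOW assignments follow the text (both inputs low = stop, one input high = clockwise/flexion, reversed = anticlockwise/extension), while the PWM duty-cycle value is an assumption.

```python
# Hypothetical decision-controller mapping; not the authors' firmware.
HIGH, LOW = 1, 0

MOTOR_COMMANDS = {
    "rest":      {"IN1": LOW,  "IN2": LOW,  "pwm": 0},    # motor stopped
    "idle":      {"IN1": LOW,  "IN2": LOW,  "pwm": 0},
    "hold":      {"IN1": LOW,  "IN2": LOW,  "pwm": 0},    # keep current posture
    "flexion":   {"IN1": HIGH, "IN2": LOW,  "pwm": 180},  # clockwise rotation
    "extension": {"IN1": LOW,  "IN2": HIGH, "pwm": 180},  # anticlockwise rotation
}

def command_for(activity):
    """Return the motor-driver pin states for a classified activity."""
    return MOTOR_COMMANDS.get(activity, MOTOR_COMMANDS["rest"])

print(command_for("flexion"))
```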

The schematic diagram of the proposed system is shown in Figure 9. As shown in the diagram, the system consists of a microcontroller unit (U1), motor driver units (U2 and U3), a BCI unit (U4), the EEGDAI platform (U5), and a DC motor unit (M1, M2, and M3). The OpenBCI module passed the electrical signals to the EEGDAI, which classified the signals by finger movement. The motor control signals were generated by the decision controller according to the signal classification process. The M1, M2, and M3 motors were then driven according to the motor imagery.

5. Results

The results of this research were based on EEG data recorded from healthy subjects to observe wave behavior based on motor imagery. EEG data were collected at a 256 Hz sampling frequency with 32 sample counters per buffer using the Ag/AgCl-coated 8-channel electrodes. We observed that the electrodes had a minimum mean impedance of 5.28 kΩ and a maximum mean impedance of 5.81 kΩ. This corresponds to a mean precision of 0.89, since we expected the mean impedance to be 5 kΩ. Five channels for mu rhythms and eight channels for beta rhythms were extracted from the original data to determine the motor imagery activity. The mu rhythms are the 8–12 Hz EEG oscillations recorded from the scalp electrodes corresponding to the sensorimotor region of the brain (C1, C2, C3, C4, and Cz). The beta rhythms range from 13 to 30 Hz, and these wave signals were extracted from the original data related to the primary motor cortex region of the brain (FC3, FC4, C1, C2, C3, C4, Cz, and CPz).

For the testing, 27 healthy subjects were selected from different age groups and genders. All subjects were new to BCI use and had no illness related to brain activity. Each test subject completed an experiment of 42 s duration. A single experiment consisted of five test activities (resting, idle, flexion, hold flexion, and extension). The activities and actions of the test subject during the experiment are described in Table 6.

Each test activity of the experiment was performed as follows:

Test 1. Here, the “rest” activity was performed for a duration of 6 s. In the “rest” state, the subject did not perform any activity such as flexion, extension, or idle movement. The mu and beta rhythms recorded in this first step were very small, and in some test subjects they contained only relaxation-related activity. Figure 10 illustrates the processed mu and beta waveforms during the experiment.

Test 2. In the second experiment, the subject remained in the “idle” state for a duration of 7 s and did not perform either flexion or extension. Here, the processed output waveform (Figure 11) was the same as in the resting state, but the averaged mu and beta rhythms were considerably larger.

Test 3. This experiment was considered the “flexion” state. Flexion movements of the subject’s fingers were performed for a duration of 7 s. The output signals of the mu and beta rhythms varied as shown in Figure 12. The mu and beta rhythms on the graph show the variation between the idle and flexion movements.
During the flexion movement, the mu and beta rhythms showed varied behavior. Sudden EEG variations caused high power to be detected; higher power is denoted by red in the 2D topography. Figure 13 shows the ERD/ERS topographical view of the mu and beta frequencies during the flexion activity. As shown in the 2D topography, higher power was detected over the primary motor cortex region of the brain, in which the motor imagery associated with finger movements occurs.

Test 4. In this step, the subject performed a “hold” activity for a duration of 15 s. All the fingers of the subject’s hand were fully flexed, and the subject concentrated on maintaining the finger flexion. In this fourth experiment, the mu and beta rhythms varied abnormally in some subjects due to unstable concentration while performing the task.

Test 5. In the “extension” state, the subject carried out an extension movement for a duration of 7 s. Figure 14 illustrates the waveform generated during the “extension” activity, and Figure 15 demonstrates the ERD/ERS topographical view of the mu and beta frequencies while performing the activity. When the subject performed the “extension” activity, a high power concentration was observed. As shown in the figure, high power was detected at all electrodes in the central region. This change was detected due to the high concentration after performing the hold flexion.
Table 7 depicts the observation of the controller output signals that were attached to the robot hand model. In the first, second, and fourth experiments, the motors remained in the “Low” mode, in which they did not rotate in either the clockwise or anticlockwise direction. In the third experiment, the fingers of the robot hand model were flexed because the motors rotated in the clockwise direction. In the fifth experiment, the motors rotated in the anticlockwise direction; therefore, the robot hand fingers performed the extension motion.
The output signal of the motor control is shown in Figure 16. As shown in the figure, a positive high represents the clockwise rotation of the motors of the hand model, and a negative high represents the anticlockwise rotation.
Table 8 depicts the experiment results for the 27 test subjects (male: 52%, female: 48%) across the five experiments. We used four age groups (15–30, 31–40, 41–55, and >55) for the experiment. Figure 17 depicts the average accuracy for both male and female test subjects based on the age categories in Table 8. According to the results, the highest accuracy was achieved by both male and female test subjects in the 15–30 age group. The age group above 55 showed lower accuracy due to weak mu and beta rhythm power during the EEG acquisitions. As the test subjects grew older, we observed that the system’s accuracy dropped.
Figure 18 illustrates the confusion matrix of the experiment results [54]. Equations (4) and (5) [55] describe the accuracy and precision of the test results derived from the confusion matrix:

Accuracy = (TP + TN)/(TP + TN + FP + FN), (4)

Precision = TP/(TP + FP), (5)

where TN is the number of correct predictions of a negative case, TP is the number of correct predictions of a positive case, FP is the number of incorrect predictions of a positive case, and FN is the number of incorrect predictions of a negative case.
The overall accuracy of the experiment was 89.34%, and the precision was 0.894. We observed that test 4 had the lowest mean value (m = 20.8) due to a lack of mental focus during the experiment.
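For reference, equations (4) and (5) can be evaluated directly from confusion-matrix counts; the counts used in the sketch below are illustrative only and are not taken from Figure 18.

```python
def accuracy(tp, tn, fp, fn):
    """Overall accuracy from confusion-matrix counts, as in equation (4)."""
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    """Precision from confusion-matrix counts, as in equation (5)."""
    return tp / (tp + fp)

# Illustrative counts only (not the paper's confusion matrix):
print(accuracy(tp=110, tn=100, fp=13, fn=12))   # ~0.89
print(precision(tp=110, fp=13))                 # ~0.89
```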

6. Discussion

Due to the country’s civil war and related injuries, and a lack of innovation in Sri Lanka, it has been difficult to find solutions for handicapped people. In this research paper, we designed and developed a possible solution: an anatomical robot hand model that functions using eight EEG channels. After conducting a comprehensive review of recent studies by other researchers, we found that the majority used at least 13 EEG channels. We used eight EEG electrodes in this research (FC3, FC4, C1, C2, C3, C4, Cz, and CPz), which cover the most effective region of the primary motor cortex of the human brain. The experiment results suggested that the proposed design worked at 89.34% accuracy. Therefore, controlling an anatomically structured robot hand model is possible with eight EEG channels. This is also advantageous since using fewer EEG channels means lower cost.

A three-finger (thumb, index, and middle) robot hand model was developed in this research. However, a five-finger model is needed to act like a real human hand. It is also essential to use EEG signals to control each finger based on the user’s desires. According to the observations, eight EEG channels were adequate to control the flexion and extension of the robot fingers. Eight EEG signals, on the other hand, do not provide enough spatial resolution to control individual finger movements. Therefore, replicating complex movements like those of a real human hand will require modifications to this approach. Eight EEG electrodes are insufficient to cover the entire primary motor cortex and sensorimotor region of the brain when designing complex BCI systems. In real life, the hold flexion motion is essential for holding objects. In this research, we discovered that the mean value for test 4 was lower (m = 20.8) than in the other experiments, which is a disadvantage.

The authors’ next step is to design a robot hand model that controls each finger joint (DIP, PIP, and MCP) according to the motor imagery. The next proposed design is a five-finger robot hand that has the same anatomical structure and behavior as a human hand. We also expect to test the results with real-world objects.

7. Conclusions

In this research, we proposed a possible BCI solution for controlling a robot hand. The proposed hand model was developed based on the anatomical behavior of the human hand. The physical design of the robot hand model was produced using 3D printing technology. The mechanism of finger flexion and extension was achieved based on the working principle of the extensor tendons: the fingers were driven by a thread mechanism to perform the flexion and extension movements. EEG acquisition was performed, and the speed-reduction motors were controlled according to the motor imagery.

Each experiment was carried out over a 42 s time period. The proposed system was observed to work at 89.34% accuracy (precision = 0.894). According to the test results, eight EEG channels were sufficient to acquire the motor imagery for flexion and extension movements of human fingers. The proposed design provided promising results for performing grab, hold, and release tasks for hand-disabled persons.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request. The data are not publicly available because the research is part of an ongoing project.

Conflicts of Interest

The authors declare that they have no conflicts of interest.