Design, Implementation and Evaluation of a Brain-Computer Interface Controlled Mechanical Arm for Rehabilitation

Kiran George, Adrian Iniguez, Hayden Donze and Sheeba Kizhakkumthala
California State University, 800 N. State College Blvd., Fullerton, CA 92381, USA



This research is supported in part by the National Science Foundation under grants number ECCS-1150507 and HRD-0802628

Abstract— Artificial limbs and exoskeletons have been widely used in a variety of applications, from military to medicine. DARPA has focused on developing exoskeletons to aid ground soldiers in both physical performance and survivability. Robotic limbs and exoskeletons have also been used to assist individuals who have lost the ability to perform everyday activities, such as walking, because of grievous medical injuries. In this paper we present the design, implementation and evaluation of a mechanical arm controlled by commercial off-the-shelf brain-computer interface (BCI) technology. The work examines the viability of incorporating BCI technology into the control system of an exoskeleton or artificial limb that serves as a rehabilitative tool for individuals to retrain their muscles. The Emotiv software suite is used to recognize thought patterns and convert them into digital commands, which a Matlab interface communicates to an Arduino Uno that activates a particular motor to move the mechanical arm. Exhaustive simulations were performed to ascertain the performance of the BCI-based system.

Keywords—Brain Computer Interface; Mechanical Arm; Rehabilitative Tool; Emotiv Epoc; Electroencephalogram

I. INTRODUCTION

Artificial limbs are used in a wide variety of applications, from military to medicine. The use of external robotic limbs to replicate, augment or supplement human motion began as early as the 1950s [1]. Exoskeletons and artificial limbs were utilized by the military in a project initiated by the Defense Advanced Research Projects Agency (DARPA) that focused on developing an exoskeleton to aid ground soldiers in both physical performance and survivability [2]. Medical use of artificial limbs focuses on replacing lost limbs or rehabilitating weakened ones; one such case was a robotic hand used as a rehabilitative tool for an individual [3]. Work that used an exoskeleton to assist individuals who had lost the ability to perform everyday activities, such as walking, because of grievous physical injury was presented in [4]. Brain-computer interfaces (BCIs) have previously been used to enable individuals to control exoskeletons that emulate their natural muscle movements [5].

BCIs are a relatively new technology that combines neurological hardware and pattern-recognition software with the purpose of allowing individuals to communicate with a computer using their mind [6]. BCI hardware consists of a neuroheadset with a varied number of sensors that measure Electroencephalogram (EEG) and Electromyography (EMG) signals; EEG is a recording of the electrical activity of the brain from the scalp, whereas EMG is a recording used to examine the electrical activity of the muscles present within the EEG data. The headset utilized for the experiments presented in this paper is the Emotiv EEG Neuroheadset (Fig. 1); it consists of 14 sensors that read data across 14 channels.

In this paper we present an Event-Related Desynchronisation (ERD) and Electromyography (EMG) based BCI-controlled mechanical arm that will serve as a rehabilitative tool, as it allows the user to operate the mechanical arm with their mind.

Fig. 1. Emotiv EEG headset

Fig. 2. Stand-alone mechanical arm with glove hand: (a) flexion and extension of elbow; (b) making and releasing a fist


II. DESIGN AND IMPLEMENTATION

A. Hardware and Software Utilized

Hardware used included the Emotiv EEG headset, an Arduino Uno, and an Arduino Motor Shield R3. The Emotiv EEG headset consists of 14 saline sensors measuring electrical signals between 0 and 5 µV [6]; the sensors on the neuroheadset are placed to achieve the best spatial resolution. The data is transmitted through a Bluetooth dongle to a computer running the Windows 7 operating system.

The headset is paired with the Emotiv software suite, which consists of several programs; those utilized for the experiments were EmoKey and the EmoControl Panel. The EmoControl Panel software detects variations in EMG and ERD signals. ERDs are the average time-locked pattern linked to the desynchronisation of an alpha rhythm [7]; EMG signals are present within the larger EEG signal set [6] and reflect the mechanical properties of muscles at rest and during contractions [8].

A mechanical arm (Fig. 2) was built in order to test the ERD and EMG based BCI control system. The arm consists of two 2.8 A stepper motors attached to the elbow and shoulder joints, and one 5 V continuous-rotation servo attached to the frame. The stepper motors are responsible for movement of the forearm and upper arm, and the servo is responsible for opening and closing the glove hand.

B. ERD-EMG Pattern Recognition and Motor Control Process

The process of controlling the stand-alone mechanical arm begins with placing the Emotiv EEG headset on the user. The brain-signal data is then collected and transmitted to a computer. The EmoControl Panel analyzes the data and determines whether ERD or EMG patterns are present. Once a pattern is classified, the EmoKey software sends a corresponding keystroke based on the specific pattern detected. The keystroke is then sent to a Matlab GUI consisting of different buttons, which generate specific commands within the Arduino for various motor movements. The different motor movements for the mechanical arm (Fig. 2) include opening and closing the hand, moving the forearm up and down, and moving the upper arm up and down. A flowchart of the process is shown below in Fig. 3.

Fig. 3. ERD-EMG Pattern Recognition and Motor Control Process
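The paper does not specify the serial protocol between the Matlab GUI and the Arduino, the motor wiring, or the step sizes used per command. The sketch below is therefore only a minimal illustration of how the Arduino side of the process in Fig. 3 could be organized: each button press in the Matlab GUI is assumed to forward a single hypothetical command character over serial, which the Arduino dispatches to the elbow stepper, the shoulder stepper, or the glove-hand servo. All pin numbers, speeds, step counts and the character-to-movement mapping are illustrative assumptions.

```cpp
// Hypothetical Arduino-side command dispatcher for the mechanical arm.
// Pin numbers, steps per revolution, speeds, and the single-character
// command protocol are illustrative assumptions, not values from the paper.
#include <Stepper.h>
#include <Servo.h>

const int STEPS_PER_REV = 200;                        // assumed 1.8-degree steppers
const int MOVE_STEPS = 50;                            // assumed steps per command

Stepper elbowStepper(STEPS_PER_REV, 8, 9, 10, 11);    // forearm joint (assumed pins)
Stepper shoulderStepper(STEPS_PER_REV, 2, 3, 4, 5);   // upper-arm joint (assumed pins)
Servo handServo;                                      // continuous-rotation servo for the glove hand

void setup() {
  Serial.begin(9600);                 // the Matlab GUI writes one character per button press
  elbowStepper.setSpeed(30);          // rpm (assumed)
  shoulderStepper.setSpeed(30);
  handServo.attach(6);                // assumed servo pin
  handServo.write(90);                // 90 = stop for a continuous-rotation servo
}

// Spin the continuous-rotation servo briefly in one direction, then stop.
void spinHand(int speedValue) {
  handServo.write(speedValue);        // values below/above 90 spin in opposite directions
  delay(500);                         // run time per command (assumed)
  handServo.write(90);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    switch (cmd) {
      case 'f': elbowStepper.step(MOVE_STEPS);      break;  // flexion of elbow
      case 'e': elbowStepper.step(-MOVE_STEPS);     break;  // extension of elbow
      case 'u': shoulderStepper.step(MOVE_STEPS);   break;  // upper arm up
      case 'd': shoulderStepper.step(-MOVE_STEPS);  break;  // upper arm down
      case 'c': spinHand(0);                        break;  // make a fist (close glove hand)
      case 'o': spinHand(180);                      break;  // release a fist (open glove hand)
      default:  break;                              // ignore unrecognized bytes
    }
  }
}
```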

III. RESULTS

Exhaustive simulations were performed to ascertain the performance of the system. Four movements were performed (Fig. 2): flexion of the elbow, extension of the elbow, making a fist, and releasing a fist. Five hundred measurements of response time (in seconds) were taken per movement, totaling 2000 measurements. The average response time for each movement over all 500 trials is displayed in Table 1.

In order to have a control data set, each motor movement was first tested without the BCI implementation, and then the same process was executed with the BCI. Control of the Arduino was performed with the Matlab GUI shown in Fig. 4. The control data set and the BCI implementation (identified with BCI) are distinguished in the first column of Table 1.

Fig. 4. Matlab GUI utilized for Control System

Table 1. Average response times for mechanical arm movements

Action                         Average Time (seconds)
Flexion of elbow (w/ BCI)      0.0680
Flexion of elbow (w/o BCI)     0.081798311
Extension of elbow (w/ BCI)    0.0665
Extension of elbow (w/o BCI)   0.079476844
Making a fist (w/ BCI)         0.1931
Making a fist (w/o BCI)        0.20002534
Releasing a fist (w/ BCI)      0.0338
Releasing a fist (w/o BCI)     0.03487447
Average Time With BCI          0.0904
Average Time Without BCI       0.0990
Percent Difference             9.18%

It can be observed that the average time for movements using the BCI-based control system was 9.18% lower than that of the non-BCI-based control system. This is significant and demonstrates that BCI-based control is a viable choice for artificial limbs or exoskeletons that are currently used for rehabilitation.
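The summary rows of Table 1 follow from the four per-movement averages. Reading "Percent Difference" as the symmetric percent difference (the gap between the two averages divided by their mean, an interpretation consistent with the tabulated values), a short check reproduces the reported 9.18%:

```cpp
// Check of Table 1's summary rows: per-condition averages and the
// symmetric percent difference (difference over the mean of the two averages).
// Values are copied from Table 1; the formula is one reading of "Percent Difference".
#include <cstdio>

int main() {
    const double withBci[4]    = {0.0680, 0.0665, 0.1931, 0.0338};
    const double withoutBci[4] = {0.081798311, 0.079476844, 0.20002534, 0.03487447};

    double sumWith = 0.0, sumWithout = 0.0;
    for (int i = 0; i < 4; ++i) {
        sumWith    += withBci[i];
        sumWithout += withoutBci[i];
    }
    const double avgWith    = sumWith / 4.0;       // ~0.0904 s
    const double avgWithout = sumWithout / 4.0;    // ~0.0990 s

    // Symmetric percent difference: (b - a) / ((a + b) / 2)
    const double pctDiff = 100.0 * (avgWithout - avgWith) / ((avgWith + avgWithout) / 2.0);

    std::printf("avg with BCI    = %.4f s\n", avgWith);
    std::printf("avg without BCI = %.4f s\n", avgWithout);
    std::printf("percent difference = %.2f%%\n", pctDiff);   // ~9.18%
    return 0;
}
```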

Force measurements were taken with a FlexiForce sensor over the 12-second period taken to close the hand. The average force was 1.0585 N, with a peak of 8.4840 N; this is within the ideal tension for the wire that controls movement in the robotic hand [8].
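The paper does not describe the FlexiForce sensor's signal conditioning or calibration. Purely as a sketch, one common arrangement reads the sensor through a voltage divider on an Arduino analog pin and applies a linear calibration to estimate force; the divider wiring and the Newtons-per-volt constant below are assumptions that would have to be calibrated against known loads.

```cpp
// Hypothetical FlexiForce readout used to log grip force on the glove hand.
// The divider wiring and the linear calibration constant are assumptions;
// a real setup would be calibrated against known weights.
const int FORCE_PIN = A0;                // sensor in a voltage divider to A0 (assumed)
const float VCC = 5.0;                   // Arduino Uno supply voltage
const float NEWTONS_PER_VOLT = 2.0;      // assumed calibration slope

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(FORCE_PIN);           // 0..1023 over 0..5 V
  float volts = raw * VCC / 1023.0;
  float forceN = volts * NEWTONS_PER_VOLT;   // assumed linear calibration
  Serial.println(forceN, 4);                 // log force estimate in Newtons
  delay(100);                                // ~10 samples per second
}
```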

IV. CONCLUSION

The design, implementation and evaluation of a mechanical arm controlled by commercial off-the-shelf brain-computer interface (BCI) technology were presented. The work examined the viability of incorporating BCI technology into the control system of an exoskeleton or artificial limb that serves as a rehabilitative tool for individuals to retrain their muscles. Experimental results showed that the average time for mechanical movements using the BCI-based control system was 9.18% lower than that of the non-BCI-based control system; this significant result demonstrates that BCI-based control is a viable choice for artificial limbs or exoskeletons that are currently used for rehabilitation.

V. FUTURE WORK

Further work will include eliminating the Emotiv software, thus increasing the speed at which the control system performs. Eliminating the Emotiv software will mean applying the steady-state visually evoked potential (SSVEP) paradigm, which will allow a user more natural control. Future work also includes determining the effectiveness of the proposed system as a rehabilitation assistant by measuring the human force required to activate the motors. Also included will be the construction of a new arm attachment offering finer control, i.e., of the fingers and wrist.

VI. REFERENCES

[1] E. Garcia, J. M. Sater and J. Main, "Exoskeleton for Human Performance Augmentation (EHPA): A Program Summary," J. Robot. Soc. Japan, 2002.

[2] J. Iqbal, N. Tsagarakis and D. Caldwell, "A Multi-DOF Robotic Exoskeleton Interface for Hand Motion Assistance," in 33rd Annual International Conference of the IEEE EMBS, Boston, Massachusetts, 2011.

[3] S. Murray, "Towards the Use of a Lower Limb Exoskeleton for Locomotion Assistance in Individuals with Neuromuscular Locomotor Deficits," in 34th Annual International Conference of the IEEE EMBS, San Diego, CA, 2012.

[4] J. L. Contreras-Vidal, A. Presacco, H. Agashe, and A. Paek, "Restoration of Whole Body Movement," IEEE Pulse, pp. 34-37, January/February 2012.

[5] M. Lang and T. Mitrovic, “Investigating the Emotiv EPOC for cognitive control in limited training time.” B.S. Thesis, University of Canterbury, New Zealand, 2012.

[6] J. Sergeant, R. Geuze and W. Winsum, "Event-Related Desynchronization and P300," vol. 24, no. 3, pp. 272-277, 1987.

[7] R. B. Reilly and T. C. Lee, "Electrograms (ECG, EEG, EMG, EOG)," Dublin: IOS Press, 2010.

[8] U. Jeong, H-K. In and K-J. Cho, "Implementation of various control algorithms for hand rehabilitation exercise using wearable robotic hand," Springer, 2013.