
Smart Wheelchair - Emotiv-EasyCap-OPENViBE - Final Report - 6/14/2016

OPENViBE Signal Processing of the Emotiv-EasyCap System for the Hand Motor Imagery Scenario Study

Table of Contents

1 Abstract

2 Background

3 Methods and Results

4 Discussion

5 References

Prepared by: Sina Dabiri, Research Associate, ECE     Date: 6/14/2016

Reviewed by: C. T. Lin, Professor, ME                 Date:


1 Abstract

The purpose of this study is to use OPENViBE, an open-source signal processing software package, with the Emotiv-EasyCap EEG system. In the previous study, the Emotiv microprocessor was combined with the EasyCap EEG cap and electrodes. The Motor Imagery BCI with Common Spatial Pattern Filter scenario included in the software is used for cognitive command training on hand motor imagery. Six subjects were trained to execute left and right EEG commands with the Emotiv-EasyCap system and the OPENViBE software. All trained subjects executed the two cognitive commands with 65% or higher accuracy.

2 Background

The engineering laboratory of Dr. C. T. Lin has been working to build a smart wheelchair with cognitive control using the Emotiv EPOC headset. In the previous study, the Emotiv electronics and microprocessor were combined with EasyCap electrodes and head cap to gain the flexibility to move the electrodes to locations specific to motor commands (Figure 2) [1]. However, the Emotiv SDK control panel software was not sufficient to enable us to train subjects to execute two cognitive commands. Therefore, the objective of this experiment is to learn to use the OPENViBE software with the Emotiv-EasyCap system to process and analyze the signal for executing two cognitive commands.

Figure 1: EasyCap’s head cap [1], and OPENViBE software for signal processing.


The software includes a scenario called Motor Imagery BCI with Common Spatial Pattern Filter, a hand motor imagery scenario in which imagining closing the left hand selects the left side of the screen and imagining closing the right hand selects the right side (Figures 3 and 4) [2].

3 Methods and Results

The experiment was conducted in Dr. Lin’s laboratory. The first subject had 5 training sessions, the second subject 3, the third subject 2, the fourth subject 4, the fifth subject 3, and the sixth subject 4. The subjects were able to execute the two commands with above 65% accuracy even after a single training session.

Before the start of the experiment there is a 10-minute meditation period in which subjects focus on their breathing and gently bring their wandering mind back to the breath. The purpose of this meditation is to quiet the brain and calm the emotional state of the subject, which reduces the mental background noise while making cognitive commands. The training session then follows for one hour. A noise-blocking headset (3M Professional Earmuff, NRR 30 dB) is used to minimize background noise and distractions.

The 14 electrode channels were placed on the EasyCap cap as outlined in Figure 2. The channel locations are based on the location of the brain’s hand motor cortex, and they follow OPENViBE’s electrode recommendation for the handball scenario [2]. Electrode connectivity is confirmed with the Emotiv SDK control panel software.


Figure 2: The 14 electrode positions on the left are the handball motor imagery setup; the right image shows the Emotiv headset’s original electrode positions [3]. The ground electrode, channel 2, is positioned on the nose. The brain’s reference electrode, channel 1, is on Fpz. The abbreviations for the electrode locations are C: central; F: frontal; P: parietal; O: occipital; T: temporal. The view is a bird’s-eye view of the top of the head, with the nose at the top (nasion). The inion is a bony prominence at the back of the head.

The OPENViBE acquisition server was used to acquire the raw EEG signal and connect to the Emotiv EPOC SDK control panel software. The OPENViBE Designer’s motor imagery with Common Spatial Pattern filter scenario was used for training. It consists of six subprograms:

3.1 mi-csp-0-signal-monitoring.xml: This program runs in the background to check the signal quality of the setup.

3.2 mi-csp-1-acquisition.xml: This program acquires the data used to train the classifier that will discriminate left- and right-hand movements. Twenty left and twenty right commands appear in random order during the training, and each training run takes 8 minutes.
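For illustration, the trial structure of this acquisition run can be sketched in Python. This is a rough sketch, not the scenario's own implementation: the 1 s fixation cross and 1.25 s arrow durations are taken from the Figure 3 caption below, while the inter-trial rest is an assumed value chosen so that 40 trials last roughly the 8 minutes stated above.

```python
import random

# Illustrative cue sequence for one acquisition run:
# 20 left and 20 right cues presented in random order.
FIXATION_S = 1.0    # red plus sign shown before each command (Figure 3)
CUE_S = 1.25        # left/right arrow on screen (Figure 3)
REST_S = 9.75       # assumed inter-trial rest (not specified in the report)

def make_cue_sequence(n_per_class=20, seed=0):
    """Return a randomly ordered list of 'left'/'right' cues."""
    rng = random.Random(seed)
    cues = ["left"] * n_per_class + ["right"] * n_per_class
    rng.shuffle(cues)
    return cues

if __name__ == "__main__":
    cues = make_cue_sequence()
    trial_s = FIXATION_S + CUE_S + REST_S
    print(f"{len(cues)} trials, about {len(cues) * trial_s / 60:.1f} minutes total")
    print(cues[:8], "...")
```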


Figure 3: The training window shows a black screen to the subject; 1 second before each command a red plus sign is displayed, and then the left or right arrow appears. The command stays on screen for 1.25 seconds.

3.3 mi-csp-2-train-csp.xml: This program computes the Common Spatial Pattern filter, a spatial filter that maximizes the difference between the signals of the two commands.
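The idea behind this step can be illustrated with a short NumPy sketch of the standard two-class CSP computation, a generalized eigendecomposition of the class covariance matrices. This is not OpenViBE's code; the epoch shapes, sampling rate, and number of retained filter pairs are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_left, trials_right, n_pairs=3):
    """Compute Common Spatial Pattern filters for two classes.

    trials_*: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2 * n_pairs, n_channels) matrix of spatial filters.
    """
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))   # trace-normalize each trial covariance
        return np.mean(covs, axis=0)

    c_left, c_right = mean_cov(trials_left), mean_cov(trials_right)
    # Generalized eigenproblem: c_left w = lambda (c_left + c_right) w
    eigvals, eigvecs = eigh(c_left, c_left + c_right)
    order = np.argsort(eigvals)            # ascending eigenvalues
    # Filters at both ends maximize variance for one class and minimize it for the other.
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return eigvecs[:, picks].T

if __name__ == "__main__":
    # Random data standing in for band-pass filtered EEG epochs
    # (14 channels, 2 s epochs at an assumed 128 Hz).
    rng = np.random.default_rng(0)
    left = rng.standard_normal((20, 14, 256))
    right = rng.standard_normal((20, 14, 256))
    print(csp_filters(left, right).shape)   # (6, 14)
```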

3.4 mi-csp-3-classifier-trainer.xml: This program trains a Linear Discriminant Analysis (LDA) classifier on the previous acquisition session. LDA is a statistical method that, in this experiment, finds a linear combination of features that separates the two commands from each other (Table 1).
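As a rough Python analogue of this step, the sketch below trains an LDA classifier on log-variance features of CSP-filtered epochs and reports the same two numbers as Table 1: a cross-validation test accuracy and a training-set accuracy. It uses scikit-learn rather than OpenViBE's own trainer, and the epoch shapes, filters, and data are stand-ins.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def log_variance_features(trials, csp):
    """Project epochs through CSP filters and take log-variance per filter.

    trials: (n_trials, n_channels, n_samples); csp: (n_filters, n_channels).
    """
    projected = np.einsum("fc,tcs->tfs", csp, trials)
    var = projected.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in epochs and CSP filters (shapes assumed for illustration).
    left = rng.standard_normal((20, 14, 256))
    right = rng.standard_normal((20, 14, 256)) * 1.2
    csp = rng.standard_normal((6, 14))

    X = np.vstack([log_variance_features(left, csp),
                   log_variance_features(right, csp)])
    y = np.array([0] * 20 + [1] * 20)

    lda = LinearDiscriminantAnalysis()
    cv_acc = cross_val_score(lda, X, y, cv=5).mean()   # cross-validation test accuracy
    train_acc = lda.fit(X, y).score(X, y)              # training-set accuracy
    print(f"cross-validation {cv_acc:.2f}, training set {train_acc:.2f}")
```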


Table 1: The results from the first training of the subjects (cross-validation test accuracy and training-set accuracy, in percent, for the left and right commands).

Subject     Date         Session   Cross-Validation Test    Training Set
                                   Accuracy (%)             Accuracy (%)
                                   Left     Right           Left     Right
Subject 1   10/30/2015   1         70.9     71.6            71.0     71.5
            11/6/2015    2         80.8     74.4            80.4     74.5
            11/7/2015    3         64.5     65.4            65.8     65.9
            11/28/2015   4         74.2     75.8            74.7     75.4
            1/8/2016     5         76.1     77.0            75.5     77.1
Subject 2   12/16/2015   1         76.4     77.2            76.4     77.3
            1/11/2016    2         75.2     84.6            75.2     85.3
            1/14/2016    3         77.8     84.3            77.9     84.4
Subject 3   1/6/2016     1         80.8     73.5            80.7     73.6
            1/13/2016    2         67.7     65.9            68.3     67.0
Subject 4   3/11/2016    1         71.3     65.2            71.4     65.9
            3/18/2016    2         73.1     80.3            74.0     80.8
            4/1/2016     3         77.9     78.9            77.9     78.7
            4/15/2016    4         76.2     81.6            76.3     81.7
Subject 5   3/15/2016    1         73.8     65.4            74.8     65.4
            3/22/2016    2         73.6     67.0            74.3     67.7
            4/14/2016    3         77.7     74.2            78.1     75.1
Subject 6   3/16/2016    1         71.4     74.7            72.4     75.7
            4/1/2016     2         71.9     75.3            72.7     75.5
            4/8/2016     3         72.9     78.0            72.7     77.7
            4/15/2016    4         66.3     69.1            65.9     69.5

3.5 mi-csp-4-online.xml: This program is the second training and adds real-time feedback to the visualization during training, using the trained LDA classifier. The feedback is a bar that lengthens as the strength of the signal changes, showing which command the program interprets the subject as trying to make (Table 2).

Figure 4: After the command arrow is presented, feedback is displayed as a blue bar whose length shows the strength of the command the program is interpreting. The feedback stays on screen for 3.75 seconds.
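As a loose illustration of how a classifier output could drive such a feedback bar, the sketch below maps a signed decision value to a bar length. This is only a conceptual stand-in for the on-screen blue bar, not how OpenViBE renders it; the logistic squashing and bar width are arbitrary choices.

```python
import numpy as np

def feedback_bar(decision_value, max_len=40):
    """Map a signed classifier decision value to a text 'bar'.

    Positive values are taken as 'right', negative as 'left'; the magnitude,
    squashed to 0..1 with a logistic, sets the bar length.
    """
    strength = 1.0 / (1.0 + np.exp(-abs(decision_value)))   # 0.5 .. 1.0
    length = int((strength - 0.5) * 2 * max_len)
    side = "right" if decision_value >= 0 else "left"
    return f"{side:>5}: " + "#" * length

if __name__ == "__main__":
    for d in (-2.0, -0.5, 0.3, 1.5):
        print(feedback_bar(d))
```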


Table 2: The results from the second training of the subjects (cross-validation test accuracy and training-set accuracy, in percent, for the left and right commands).

Subject     Date         Session   Cross-Validation Test    Training Set
                                   Accuracy (%)             Accuracy (%)
                                   Left     Right           Left     Right
Subject 1   11/28/2015   4         97.7     96.4            97.6     96.5
            1/8/2016     5         93.6     92.4            93.5     92.8
Subject 2   12/16/2015   1         error    error           error    error
            1/11/2016    2         87.7     84.7            87.3     85.1
            1/14/2016    3         88.5     96.0            89.1     96.3
Subject 3   1/6/2016     1         81.3     78.8            81.2     78.9
            1/13/2016    2         94.5     86.2            94.3     86.7
Subject 4   3/11/2016    1         82.3     83.8            82.6     84.2
            3/18/2016    2         78.3     80.8            78.4     80.8
            4/1/2016     3         75.9     77.6            76.0     78.0
            4/15/2016    4         73.3     78.8            73.1     79.8
Subject 5   3/15/2016    1         75.7     73.4            76.2     73.9
            3/22/2016    2         75.4     77.6            75.4     78.1
            4/14/2016    3         75.1     74.9            74.8     74.8
Subject 6   3/16/2016    1         74.0     70.6            74.9     70.4
            4/1/2016     2         78.9     70.7            79.2     71.5
            4/8/2016     3         69.9     76.6            69.9     76.3
            4/15/2016    4         76.5     80.2            76.9     80.5

3.6 mi-csp-5-replay.xml: This program replays the online session recorded in the previous step. It displays the confusion matrix of the classifier and its global performance during the session (Table 3).
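A confusion matrix of the kind reported by this replay step can be reproduced from logged true and predicted labels. The NumPy sketch below shows how the per-command accuracies in Table 3 relate to such a matrix; the logged labels used here are hypothetical, not values from the experiment.

```python
import numpy as np

def confusion_matrix(true_labels, predicted_labels, classes=("left", "right")):
    """Rows are the true class, columns the predicted class."""
    index = {c: i for i, c in enumerate(classes)}
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for t, p in zip(true_labels, predicted_labels):
        m[index[t], index[p]] += 1
    return m

if __name__ == "__main__":
    # Hypothetical logged labels from one online session (20 cues per class).
    true = ["left"] * 20 + ["right"] * 20
    rng = np.random.default_rng(1)
    pred = [t if rng.random() < 0.8 else ("right" if t == "left" else "left")
            for t in true]

    m = confusion_matrix(true, pred)
    per_class = m.diagonal() / m.sum(axis=1)   # analogous to the Left/Right columns of Table 3
    print(m)
    print(f"left {per_class[0]:.0%}, right {per_class[1]:.0%}")
```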


Table 3: The overall result for each training session (replay accuracy, in percent, for the left and right commands).

Subject     Date         Session   Replay Accuracy (%)
                                   Left     Right
Subject 1   10/30/2015   1         69.0     73.0
            11/6/2015    2         81.0     76.0
            11/7/2015    3         65.0     66.0
            11/28/2015   4         96.0     96.0
            1/8/2016     5         92.0     91.0
Subject 2   12/16/2015   1         76.0     78.0
            1/11/2016    2         84.0     86.0
            1/14/2016    3         86.0     97.0
Subject 3   1/6/2016     1         93.0     89.0
            1/13/2016    2         80.0     71.0
Subject 4   3/11/2016    1         83.0     85.0
            3/18/2016    2         73.0     81.0
            4/1/2016     3         77.0     76.0
            4/15/2016    4         73.0     79.0
Subject 5   3/15/2016    1         77.0     74.0
            3/22/2016    2         74.0     75.0
            4/14/2016    3         76.0     74.0
Subject 6   3/16/2016    1         75.0     70.0
            4/1/2016     2         79.0     70.0
            4/8/2016     3         67.0     76.0
            4/15/2016    4         76.0     81.0

There was another 10-minute meditation at the end of the training. The purpose of this meditation is to enhance learning and training. Subjects were rewarded with a chocolate candy when they performed better than in their previous session.


Figure 5: Subject 1 and 2’s results from the mi-csp-5-replay.xml step.

Figure 6: Subject 3 and 4’s results from the mi-csp-5-replay.xml step.

[Bar charts: replay accuracy (%) on the y-axis (0 to 100), session dates on the x-axis, with one pair of bars (Left, Right) per session for each subject.]


Figure 7: Subject 5 and 6’s results from the mi-csp-5-replay.xml step.



4 Discussion

The purpose of this study was to use OPENViBE for signal processing of the Emotiv-EasyCap EEG system. Previously, signal processing for two cognitive commands using the Emotiv SDK software had not been successful. The OPENViBE software was able to discriminate two cognitive commands with 65% and above accuracy for all six subjects.

Even though the number of training sessions was small, at most 5 per subject, the OPENViBE software appears to process the signals well. However, the BCI literature indicates that 20 to 50 training sessions, each at least 30 minutes, are needed for optimal training [4]. With substantially more training, the subjects may be able to execute the two cognitive commands significantly better, perhaps with above 90% accuracy; Subject 1 was already performing above 90% after four training sessions.

For the follow-up study, the next step is to see how subjects perform while maneuvering the powered wheelchair with the Emotiv-EasyCap-OPENViBE system. In addition, a LabVIEW program (VI) is being written to communicate between the OPENViBE software and the smart wheelchair's central command program. We will use TCP/IP over Ethernet to send commands to the central LabVIEW VI: OPENViBE has a "TCP Write" programming block that we use as a server to send the command to a specific port, and our TCP client VI uses a "TCP Read" block to read the command from that port.
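A minimal sketch of the receiving side is shown below, written in Python instead of LabVIEW so the example is self-contained. Everything box-specific here is an assumption to be checked against the OpenViBE TCP box's actual settings: the host and port, an initial stream header that is simply skipped, stimulations arriving as little-endian 8-byte integers, and the left/right cue codes (the standard GDF left-hand/right-hand identifiers).

```python
import socket
import struct

HOST, PORT = "127.0.0.1", 5678        # assumed address of the OpenViBE TCP output
HEADER_BYTES = 32                      # assumed header size; adjust to the box's format
STIM_LEFT, STIM_RIGHT = 769, 770       # assumed GDF left-hand / right-hand cue codes

def recv_exact(sock, n):
    """Read exactly n bytes or raise if the connection closes."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("OpenViBE closed the connection")
        buf += chunk
    return buf

def main():
    with socket.create_connection((HOST, PORT)) as sock:
        recv_exact(sock, HEADER_BYTES)             # discard the stream header
        while True:
            (code,) = struct.unpack("<Q", recv_exact(sock, 8))
            if code == STIM_LEFT:
                print("command: turn left")        # forward to wheelchair control here
            elif code == STIM_RIGHT:
                print("command: turn right")

if __name__ == "__main__":
    main()
```

In the actual system this role is played by the LabVIEW client VI; the sketch only illustrates the command flow from the "TCP Write" block to a listener on a known port.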

Once we have demonstrated successful maneuvering of the smart wheelchair with two cognitive commands, we can add a third and fourth command using foot motor imagery, for example tapping the right foot to move forward and tapping the left foot to stop.


5 References

[1] S. Debener, Manual - Emotiv EPOC step-by-step, Herrsching, Germany: EasyCap GmbH.

[2] Ibonet, "Motor Imagery BCI with Common Spatial Pattern Filter," OPENViBE, 31-9-2011. [Online]. Available: http://openvibe.inria.fr/motor-imagery-bci-with-common-spatial-pattern-filter/. [Accessed October 2015].

[3] P. Bobrov, A. Frolov, C. Cantor, I. Fedulova, M. Bakhnyan and A. Zhavoronkov, "Brain-computer interface based on generation of visual images," PLoS ONE, vol. 6, no. 6, 2011.

[4] E. C. Leuthardt, "The Evolution of Brain-Computer Interfaces," The Bridge on Frontiers of Engineering, vol. 42, no. 1, pp. 41-50, 2012.