
Page 1: Eye Tracking Device for IKIDS

Eye Tracking Device for IKIDS

Project Proposal

ECE445: Senior Design

Qiong Chen, Arsh Sigh, Qiuyuan Zhang

TA: Igor Fedorov

Date: February 10, 2014


Table of Contents

1. Introduction
   1.1 Project Overview
   1.2 Objectives
       1.2.1 Goals
       1.2.2 Features
       1.2.3 Benefits
2. Design
   2.1 Block Diagram
   2.2 Block Descriptions
   2.3 Logic Flow Chart
3. Requirements and Verifications
   3.1 Requirements and Verification
   3.2 Tolerance Analysis
4. Project Cost and Schedule
   4.1 Cost
       4.1.1 Labor Cost
       4.1.2 Part Cost
       4.1.3 Total
   4.2 Schedule


1. Introduction

1.1 Project Overview

Title: Eye Tracking Device for IKIDS

IKIDS (Illinois Kids Development Study) at the University of Illinois is studying the relationship between exposure to the chemical Bisphenol A (BPA) and infants' health. At the very beginning of the study, pregnant women are recruited at Ob-Gyn clinics in early pregnancy, and urine samples are collected throughout pregnancy to measure chemical exposure. Researchers then assess babies' memory and attention 18–48 hours after birth at the hospital. In this stage of the research, 500 newborns will be assessed between June 2014 and June 2016. Children's cognition will also be assessed at the IKIDS lab at ages 4 months, 7 months, 3 years, and 5 years. In order to assess babies' memory and attention successfully, the IKIDS lab has asked us, as ECE students, to design a new eye tracking system.

1.2 Objectives

1.2.1 Goals:

Our goal is to build a relatively mobile unit that can be used at Carle Hospital to assess the memory and attention of newborn infants. The project consists of a webcam that captures the baby's facial image, a calibration module used to attract the baby's attention, two monitors that show testing images, and software that runs the eye tracking algorithm and synchronizes and controls the whole system.

1.2.2 Features:

Our group will focus on real-time processing of camera images in order to guide the researcher and the newborn through the research process. The finished device should have the following features:

(1) Display a series of pictures to an individual infant using two monitor screens

(2) Calibrate to determine the baby's head position

(3) Videotape the infant's face during the presentation

(4) Track eye movements to determine where the infant looks and for how long


(5) Provide a user interface that allows the researcher to control the whole process in a coordinated manner

1.2.3 Benefits

(1) Making the unit mobile so that it can be moved to different areas of the hospital where newborns will be located;

(2) Adding an eye tracker to measure the direction and duration of infants' looking responses more reliably and precisely (commercially available eye trackers are prohibitively expensive at more than $30k);

(3) Adding features to the unit that will allow the infant to be positioned correctly in front of the display and will block extraneous stimuli from the infant's view;

(4) Creating a user-friendly interface that will allow the researcher to control the computer, the camera, and the eye tracker remotely and in a coordinated manner.


2. Design

2.1 Block Diagram

[Block diagram: a Computer (eye tracking, analysis, user interface, and controller software) connects through a USB-UART converter to the Master Controller Module (master microcontroller). The master microcontroller drives output relays that switch power to the Recorder Playback Module (recorder IC, microphone, speaker), the LCD Display Module (slave microcontroller and LCD display), and the webcam. The 5V/3.3V Power Unit (AC power outlet, AC-9V adapter, DC-DC converter) supplies these modules. The computer also drives the left and right stimulus monitors and the monitor for user control.]


2.2 Block Descriptions

2.2.1 Master Controller Module

The ATMEGA328P microcontroller will be used to control the Record and Play-back Module, the LCD display, and the webcam. Commands from the computer will be sent over the UART protocol to control each sub-module. During the calibration phase, the microcontroller will turn on the webcam, generate a colorful pattern on the LCD display, and play a pre-recorded message through the speakers to attract the baby's attention. At the start of each trial, an alarm will also be sent through the speaker to signal the beginning of a new trial. The microcontroller will be powered by a 5V source from the 5V Power Unit. It will also power the LCD display, the webcam, and the Record and Play-back Module on and off by driving relays.
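The proposal does not fix a wire format for these UART commands. As a sketch only, assuming a hypothetical three-byte frame (start byte, opcode, XOR checksum), the host-side framing could look like:

```python
# Sketch of a hypothetical host-side command protocol for the master
# ATMEGA328P. The byte layout and command codes below are assumptions,
# not part of the proposal.

START = 0x7E  # assumed frame-start byte

# Hypothetical command codes, one per sub-module action.
COMMANDS = {
    "webcam_on": 0x01,
    "webcam_off": 0x02,
    "lcd_pattern": 0x03,
    "play_calibration_msg": 0x04,
    "play_trial_alarm": 0x05,
}

def frame(command: str) -> bytes:
    """Build one command frame: start byte, opcode, XOR checksum."""
    op = COMMANDS[command]
    return bytes([START, op, START ^ op])

def parse(data: bytes) -> str:
    """Decode and verify a frame; the firmware side would do the same."""
    start, op, checksum = data
    if start != START or checksum != (START ^ op):
        raise ValueError("corrupt frame")
    return {v: k for k, v in COMMANDS.items()}[op]
```

In practice the frame would be written to the USB-UART bridge with a serial library; the checksum lets the firmware reject bytes corrupted on the wire.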

2.2.2 5V-3.3V Power Module

In order to provide enough current for the speaker module, the LCD display module, and the webcam power-control module, we designed our own power source. From a 120V AC outlet, the circuit will use an AC-DC converter to step the voltage down to four power outputs of 5V DC each. In addition, a DC-DC converter will provide a stable 3.3V DC output for the LCD display. Each output will be switched by a relay driven by signals from the master microcontroller, so each individual output can be powered on and off.

2.2.3 Record and Play-back Module

Our speaker module will include a recorder IC, a microphone, and a speaker. The ISD1700 recorder IC, capable of recording two messages of up to 40 seconds each, will be used. The microphone will record the desired messages from the researcher. One message will attract the baby's attention during the calibration phase, while the other will indicate the start of a new trial. The message will be selected by the ATMEGA328P microcontroller, and the speaker's data line will be connected to the outputs of the IC. Both the speaker and the recorder IC will be powered by the 5V Power Unit.

2.2.4 Web Camera

The web camera will capture the eye movements of the test subject. During real-time processing, images of the baby's eyes will be captured and fed into the processing algorithm to determine the position of the eyes. The camera will be turned on and off according to the microcontroller's commands and will be powered by the 5V Power Unit.


2.2.5 LCD Module

The purpose of the LCD module is to generate colorful patterns that attract the baby's attention. The module will include an RGB LCD powered by the 3.3V supply and a slave ATMEGA328P microcontroller with encoded patterns for controlling the display. The LCD will be controlled by the microcontroller over the SPI protocol. It will be positioned between the two monitors, just below the video camera.

2.2.6 Monitors Left and Right

These monitors are provided by the IKIDS lab and are used for both calibration and research testing. In the calibration phase, the monitors will display stimuli (geometric pictures) that draw the baby's attention to the center of each monitor, one after the other, to gather the initial relationship between the baby's eye movement and their point of focus on the monitor. During research testing, two faces in black and white will be displayed, one on each monitor.
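The calibration fixations described above could feed the tracker in the following way. This is a sketch under the assumption of a simple linear-threshold gaze model (the proposal does not specify the model); it assumes the pupil's image x-coordinate increases toward the right monitor.

```python
# Sketch: fit a left/right decision threshold from calibration data.
# The linear-threshold model and coordinate convention are assumptions.

def fit_threshold(left_samples, right_samples):
    """Midpoint between the mean pupil x-coordinates recorded while the
    baby fixates the left and right monitor centers."""
    left_mean = sum(left_samples) / len(left_samples)
    right_mean = sum(right_samples) / len(right_samples)
    return (left_mean + right_mean) / 2.0

def classify(pupil_x, threshold):
    """Label one pupil x-coordinate as looking 'left' or 'right'
    (assumes the left monitor maps to smaller x)."""
    return "left" if pupil_x < threshold else "right"
```

During testing, each processed frame's pupil position would be run through `classify` to decide which face the baby is looking at.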

2.2.7 Monitor for User Control

This monitor will be the main user interface for the researcher. It will display the video captured by the camera, overlaid with a cross to help position the baby optimally for data gathering. In addition, it will display the eye tracking result as a heat map overlaying the two testing images, along with the processed data showing whether the baby is looking left or right. There will also be various buttons customized for user control.

2.2.8 Computer

Eye tracking will be done using the OpenCV library, while the overall software will be built with the Marvin framework in the Eclipse IDE. The program will take in video frames from the camera and commands from the researcher, store them, process the data, generate data logs, output testing images to the left and right monitors, send stimulus signals to the LCD module and speaker, and display processed eye positions on the user-control monitor.
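As an illustration of the data-logging step above, per-frame gaze labels from the tracker could be accumulated into per-side looking times. The frame rate and label names here are assumptions for the sketch, not values from the proposal.

```python
# Sketch: turn a stream of per-frame gaze labels into total looking
# time per side, for the data log and the on-screen timer.

def accumulate_look_times(frames, frame_dt=1.0 / 30):
    """frames: iterable of 'left', 'right', or None (eyes not found).
    frame_dt: seconds per frame (assumed 30 fps here).
    Returns total looking time per side, in seconds."""
    totals = {"left": 0.0, "right": 0.0}
    for label in frames:
        if label in totals:
            totals[label] += frame_dt  # frames with no detection are skipped
    return totals
```

The resulting totals are exactly the numbers the researcher needs for the looking-time comparison between the two faces.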


2.3 Logic Flow Chart

[Flow chart: Calibration → display images on the left and right monitors → start sending video to the eye tracking program → image processing → data logging, looping while the session is not interrupted.]


3. Requirements and Verifications

3.1 Requirements and Verification

Video Camera
Requirement: Records the baby for the session and transmits the video stream to the computer with less than 50 ms of delay.
Verification: Open the stream or file that the camera outputs to and check that video is coming through. Then test the delay by moving something in front of the camera and timing how long it takes to show up in the video.

LCD Display Module
Requirement: Flashes to attract the baby's attention to the center of the testing area to begin the test, with a 95% success rate.
Verification: When the module receives a signal via the USB connection, it should flash, drawing sufficient power from the USB outlet. The success rate will be calculated based on whether the baby actually looks.

Recorder Playback Module
Requirement: The speaker makes a noise at the beginning of the test to attract the baby's attention. It is also used to help the researchers coordinate their actions and notify them at important times.
Verification: The speaker should beep on receiving a signal from the program. Test this with a simple program that sends pulses to the speaker, and ensure it emits sound on cue.

Master Controller Module
Requirement: All signals from the computer programs arrive here; the module distributes them to the correct component with 100% accuracy.
Verification: Connect an LED for each component the module drives; when a signal is sent, the LED corresponding to the appropriate component should light up.

Monitors (Left and Right)
Requirement: Display the testing images to the baby in the pattern, in the sequence, and for the time defined by the controller.
Verification: Send the monitors signals from the controller program; the monitors should display the images in the correct positions and with the right orientation.

Eye Tracking Program
Requirement: Takes the input from the video camera and runs it through the eye tracking software; detects and follows the eyes 85% of the time; notifies the researchers if the eyes close or otherwise disappear from the video.
Verification: Feed a video into the computer and run the eye tracking software on it. Check how often it detects the eyes correctly by watching it track them. For the last part, the test is as simple as closing your eyes once the program has locked onto them.

Computer (Analysis)
Requirement: Detects the direction in which the eyes are looking to within a 10-degree range.
Verification: Test with a person with full cognitive ability: run the calibration module on them and track their eyes. They can report exactly where they are looking, which can be compared to the position the program believes they are looking at.

Computer (Controller)
Requirement: Sends output signals to the left and right monitors as well as the LCD module and the speaker.
Verification: Send output to each of the mentioned devices; each should respond correctly to the signal.

Computer (UI)
Requirement: The user sees a window that displays: 1) the images the baby sees on the monitors, 2) video of the baby with the eyes being tracked, 3) a timer for how long the baby has looked at a given side, and 4) buttons for interacting with the program. These tools should work in 98% of cases.
Verification: Start the program; the researcher's monitor should show the aforementioned interface objects. The buttons should allow the researcher to calibrate, start, stop, and restart the session, and to download and replay previously captured session streams.
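The 85% detection requirement above can be checked mechanically once per-frame detection results are logged during the verification run; a minimal sketch:

```python
# Sketch: compare the logged per-frame detection results against the
# 85% requirement from the verification table.

def detection_rate(frames):
    """frames: iterable of booleans, True where the eyes were detected."""
    frames = list(frames)
    return sum(frames) / len(frames)

def meets_requirement(frames, required=0.85):
    """True if the observed detection rate meets the stated threshold."""
    return detection_rate(frames) >= required
```

The same log also yields the 98% UI-tool success figure if button interactions are recorded as pass/fail events.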


3.2 Tolerance Analysis

For real-time image processing, we would like to achieve about 80% correctness in determining whether the baby is looking at the picture on the left or on the right.

For hardware control, we would like to power the LCD module with 5V from the USB port and regulate the voltage level to within ±0.2 V.
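The ±0.2 V regulation target can be expressed as a simple acceptance check applied to each measured rail voltage, e.g.:

```python
# Sketch: acceptance check for the ±0.2 V regulation tolerance above.

def within_tolerance(measured_v, nominal_v=5.0, tol_v=0.2):
    """True if a measured rail voltage is within the stated tolerance."""
    return abs(measured_v - nominal_v) <= tol_v
```

The same check applies to the 3.3 V rail by passing `nominal_v=3.3`.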


4. Project Cost and Schedule

4.1 Cost

4.1.1 Labor Cost

Name    Rate/Hour ($)   Hrs/Week   Weeks   Total (rate × 2.5 × hours)
Joy     40              15         12      18,000
Arsh    40              15         12      18,000
Sophie  40              15         12      18,000
Total                                      54,000
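The labor totals above follow the stated formula (rate × 2.5 overhead × hours/week × weeks); as a quick check:

```python
# Sketch: reproduce the labor-cost arithmetic from the table above.

def labor_cost(rate=40, hours_per_week=15, weeks=12, overhead=2.5):
    """Per-person labor cost: rate * overhead * hours/week * weeks."""
    return rate * overhead * hours_per_week * weeks

team_total = 3 * labor_cost()  # three team members at identical rates
```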

4.1.2 Part Cost

Item                                Part Number   Quantity   Unit Cost ($)   Total ($)
TFT LCD Display                     ST7735R       2          19.95           39.90
ATMEGA328P                          DEV-10524     2          5.50            11.00
USB to UART Bridge Breakout Board   BOB-00718     2          14.95           29.90
DC/DC Converter                     BOB-09370     2          29.95           59.90
9V DC 650mA Wall Adapter            RTL-10273     1          6.95            6.95
Electret Microphone Amplifier       MAX4466       1          6.95            6.95
8 Ohm Speaker                       COM-09151     2          1.95            3.90
Relay SPDT Sealed                   COM-00100     5          1.95            9.75
Crystal 16MHz                       COM-00536     2          0.95            1.90
IC Voice Rec/Play                   ISD1700       2          7.20            14.40
Push Button                         COM-09190     1          0.50            0.50
TOTAL                                                                        185.05

4.1.3 Total

Labor Cost ($)   Parts Cost ($)   Grand Total ($)
54,000           185.05           54,185.05


4.2 Schedule

Week of 10-Feb
Joy: Work on proposal; download Marvin and get first experience with it in the Eclipse IDE
Sophie: Work on proposal; research USB-to-UART design and port protocol; determine parts for the Power Module and send out the order
Arsh: Work on proposal; download and initialize the user interface

Week of 17-Feb
Joy: Use OpenCV to process images
Sophie: Prototype the Power Module; draw schematic and PCB layout; order parts for all modules
Arsh: Develop the eye tracking algorithm

Week of 24-Feb
Joy: Improve the algorithm by testing
Sophie: Prototype the Recording Module; draw schematic and PCB layout
Arsh: Develop the eye tracking algorithm

Week of 3-Mar
Joy: Develop a plug-in for Marvin to show the processed facial image
Sophie: Prototype the LCD Display Module; draw schematic and PCB layout
Arsh: Explore the OpenCV API

Week of 10-Mar
Joy: Develop a plug-in for Marvin to show the processed facial image
Sophie: Link the Power Module with the Recording Module, the LCD Display Module, and the master microcontroller on a breadboard
Arsh: Save processed image files to a local folder

Week of 17-Mar
Joy: Add buttons for different sections; add a button for the calibration command; process the calibration position and feed the data to the eye tracking algorithm
Sophie: Prototype the USB-to-microcontroller connection and the USB driver protocol
Arsh: Let the user control the display on the monitors randomly; capture images from real-time video

Week of 24-Mar (spring break)
Joy: Software can control the images shown on the monitors; show the positions the baby looks at on the monitor image (mapping)
Sophie: Work on USB driver programming
Arsh: Images shown on the monitors will display in the software

Week of 31-Mar
Joy: Generate logs based on the mapping result
Sophie: Finish the PCB layout for the Master Microcontroller Unit
Arsh: Show logs through the interface; save the interface

Week of 7-Apr
Joy: Identify gaze beyond the boundary; notify the user through the interface with a text box; let the user skip to the next trial section
Sophie: Link the Master Microcontroller Unit with the LCD Display, Recording, and Power Modules; test and debug; send out the final PCB design
Arsh: Develop a closed-eye detection algorithm using OpenCV

Week of 14-Apr
Joy: Continue testing and debugging the system
Sophie: Receive, debug, and finish the PCB; integrate it with the software
Arsh: Integrate all pieces together; integrate with the hardware

Week of 21-Apr
All: Client testing and feedback; prepare for demo and presentation

Week of 28-Apr
All: Final paper and checkout

This project involves heavy software development, so we decided to adopt an agile programming methodology. We divide all functionality into separate user stories and split each user story into minor tasks. We also assign an approximate working time to each task and the person responsible for it.

User Stories, Tasks, and Iterations:

Week of 10-Feb
1. User can start/interrupt/reset the video camera through the software
   Hours   Task                                              Lead
   4       Install Windows system and the Marvin framework   Joy
   8       Add start/interrupt/reset buttons to Marvin       Arsh
   6       Display the video camera image in the software    Joy

Week of 17-Feb to 24-Feb
2. User can view the baby's image with the eye area highlighted
   Hours   Task                                 Lead
   10      Use OpenCV to process images         Joy
   16      Develop the eye tracking algorithm   Arsh
   6       Improve the algorithm by testing     Joy


Week of 3-Mar to 10-Mar
3. User can view the baby's image with the eye area highlighted through the software interface and save it
   Hours   Task                                                             Lead
   10      Explore the OpenCV API                                           Arsh
   40      Develop a plug-in for Marvin to show the processed facial image  Joy
   6       Save the processed image file to a local folder                  Arsh

Week of 17-Mar
4. User can choose between different trial sections
   Hours   Task                                                        Lead
   2       Add buttons for different sections                          Joy
   2       Let the user control the display on the monitors randomly   Arsh

5. User can click a button to set the calibration positions (L, M, R), interacting with the real-time camera image
   Hours   Task                                                                             Lead
   2       Add a button for the calibration command                                         Joy
   8       Capture images from real-time video                                              Arsh
   4       Process the calibration position and feed the data to the eye tracking algorithm Joy

Week of 24-Mar (Spring Break)
6. User can view the pictures on the two monitors through the software interface
   Hours   Task                                                                  Lead
   8       Software can control the images shown on the monitors                 Joy
   10      Images shown on the monitors will display in the software             Arsh
   4       Show the positions the baby looks at on the monitor image (mapping)   Joy

Week of 31-Mar
7. User can view generated logs through the software interface and save them
   Hours   Task                                        Lead
   8       Generate logs based on the mapping result   Joy
   6       Show logs through the interface             Arsh
   2       Save the interface                          Arsh


Week of 7-Apr
8. User can be notified if the eyes are closed or lost (additional function)
   Hours   Task                                                    Lead
   12      Develop a closed-eye detection algorithm using OpenCV   Arsh
   2       Identify gaze beyond the boundary                       Joy
   4       Notify the user through the interface with a text box   Joy
   2       Let the user skip to the next trial section             Joy

Week of 14-Apr
9. User can use the program for the whole research process
   Hours   Task                            Lead
   20      Integrate all pieces together   Joy