MIE Presentation


DESCRIPTION

Interactive Screen is a graduation project for the year 2010 at the Faculty of Computer and Information Sciences, Ain Shams University. It is an extension of the Interactive Wall 2009 project. Interactive Screen won first place in the MIE (Made In Egypt) competition organized by IEEE. For any details, please contact: [email protected]@[email protected]@gmail.com

TRANSCRIPT

Page 1: Mie presentation

Section 1

Touch Your Vision

MULTI-TOUCH INTERACTIVE

SCREEN

Faculty of Computer and Information Sciences, Ain Shams University

Page 2: Mie presentation

Supervisors:

Professor Dr. Mohammed Roushdy, Faculty of Computer and Information Sciences

Dr. Haythem El-Messiry, Faculty of Computer and Information Sciences

T.A. Ahmad Salah, Faculty of Computer and Information Sciences

Sponsors:

Page 3: Mie presentation

Team:

Abu-Bakr Taha Abdel Khalek

Hadeel Mahmoud Mohammed

Hager Abdel Motaal Mohammed

Mahmoud Fayez El-Khateeb

Yasmeen Abdel Naby Aly

Page 4: Mie presentation

Agenda:
1. Introduction.
2. Interactive Screen vs. other systems.
3. Market research & customer needs.
4. Physical environment.
5. System framework.
6. System modules.
7. Applications.
8. Limitations.
9. Future work.
10. References.

Page 5: Mie presentation

Overview:
• Projector and two-camera system.
• HCI (Human-Computer Interaction) system.
• Interaction with hand gestures (shapes).
• Extension of Interactive Wall '09.

Introduction: Problem Definition:

Humans normally interact with each other using motions; it can be annoying and impractical to use hardware equipment to interact with someone or something.

Page 6: Mie presentation

Introduction: Motivation:

1. "Interactive Wall 2009".
2. Multi-touch technology.
3. A large touch screen at an appropriate cost.
4. Flexibility.

Page 7: Mie presentation

Vision: a future without annoying input devices, and to be proud of being part of accomplishing such a dream.

Objective: develop the Interactive Screen system to handle more features and gestures and to overcome its limitations, so that users are satisfied with its usability and flexibility of use.

Page 8: Mie presentation

Time Plan:

Milestones:
• Segmentation modules: May 2010.
• Multi-hand tracking: April 2010.
• Automatic hand detection: April 2010.
• Z-depth module: April 2010.
• Dynamic gesture module: May 2010.

Page 9: Mie presentation

Physical Environment:

• Simple components construct a new environment for interactive screens that overcomes the limitations of other systems.

• Environment limitations.

Traditional Environment

Page 10: Mie presentation

Proposed Physical Environment: [diagram of the setup: camera, mirror, Z-depth camera, projector, and the interactive screen]

Page 11: Mie presentation

Physical Environment:

Alternative solutions vs. the proposed solution

Page 12: Mie presentation

Interactive Screen vs. Other Systems

• Microsoft Surface.
• DiamondTouch.
• Touch screens.

Advantages of Interactive Screen:
• Cost.
• No need to touch the screen.
• Gesture recognition.
• Dynamic gesture recognition.
• Bare hands.
• No sensors; pure image processing.

Page 13: Mie presentation

Market Research

• In 1991, the first smart whiteboard appeared.

• Over 1.6 million smart whiteboards have been installed throughout the world.

• Surveys indicate that interactive whiteboards benefit student engagement, learner motivation, and knowledge retention.

Page 14: Mie presentation

Framework

System Controlling:

Input → Calibration → Segmentation → Hand Detection → Multi-hand Tracking → Touch Detection → Gesture Recognition → Event → Interface

Page 15: Mie presentation

Experimental Results

Calibration: applies a geometric calibration to the input frame using the four calibration points acquired by the configuration module.

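The slide does not show how the calibration is implemented. Below is a minimal sketch of a four-point geometric calibration with OpenCV in Python, assuming the configuration module supplies the four projected-screen corners in camera coordinates (the corner values, resolution, and helper names are illustrative):

```python
import cv2
import numpy as np

# Four screen corners seen in the camera image (from the configuration module)
# and the matching corners of the projected desktop, e.g. 1024x768 pixels.
camera_pts = np.float32([[112, 84], [534, 97], [521, 402], [98, 391]])
screen_pts = np.float32([[0, 0], [1024, 0], [1024, 768], [0, 768]])

# Homography that maps camera coordinates onto screen coordinates.
H = cv2.getPerspectiveTransform(camera_pts, screen_pts)

def calibrate(frame):
    """Warp a captured frame into the projected screen's coordinate system."""
    return cv2.warpPerspective(frame, H, (1024, 768))

def to_screen(point):
    """Map one detected hand position (x, y) into screen coordinates."""
    p = cv2.perspectiveTransform(np.float32([[point]]), H)
    return tuple(p[0, 0])
```

Mapping every detected hand position through the same homography keeps all of the later modules working in screen coordinates.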

Page 16: Mie presentation

Experimental Results

Segmentation: its main task is to generate a binary image, from the captured frame, that represents the foreground; results are shown for a simple background and a complex background.

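The slide does not name the segmentation method, but the reference list includes background subtraction [5] and multiple-threshold skin segmentation [6]. A minimal sketch combining both ideas (the threshold values and helper name are illustrative):

```python
import cv2
import numpy as np

def segment(frame, background):
    """Return a binary foreground mask for a calibrated BGR frame.

    background: a reference frame of the empty screen, same size as `frame`.
    """
    # Background subtraction: keep pixels that differ enough from the empty screen.
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, fg_mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)

    # Skin thresholding in YCrCb (illustrative bounds, cf. [6]).
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))

    # Keep only skin-colored foreground and clean it up with morphology.
    mask = cv2.bitwise_and(fg_mask, skin_mask)
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```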

Page 17: Mie presentation

Experimental Results

Hand Detection: responsible for automatically detecting the hand position, anywhere in the frame, when the user performs a specific gesture (open hand).

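The detection technique is not spelled out on the slide; the reference list cites template matching by fast normalized cross-correlation [2], so a sketch along those lines is shown below (the template file, threshold, and function name are assumptions):

```python
import cv2

# Grayscale template of an open hand, e.g. cropped from a recorded frame.
open_hand = cv2.imread("open_hand_template.png", cv2.IMREAD_GRAYSCALE)

def detect_open_hand(gray_frame, threshold=0.9):
    """Return the (x, y) centre of the best open-hand match, or None.

    Uses normalized cross-correlation, in the spirit of reference [2].
    """
    scores = cv2.matchTemplate(gray_frame, open_hand, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    if max_val < threshold:
        return None
    h, w = open_hand.shape
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)
```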

Page 18: Mie presentation

Experimental Results

Multi-hand Tracking: responsible for keeping track of the user's hands and knowing the actual position of each hand at all times.

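The slide does not describe the tracking algorithm. One simple way to realize it, sketched here purely as an illustration, is frame-to-frame nearest-neighbour assignment of new detections to existing tracks (the distance threshold is an assumption):

```python
import math

def update_tracks(tracks, detections, max_dist=80):
    """Associate detected hand positions with existing tracks.

    tracks:     dict {track_id: (x, y)} from the previous frame.
    detections: list of (x, y) hand positions in the current frame.
    Returns the updated dict; unmatched detections start new tracks.
    """
    updated, used = {}, set()
    for tid, (tx, ty) in tracks.items():
        # Pick the closest unused detection for this track.
        best, best_d = None, max_dist
        for i, (x, y) in enumerate(detections):
            d = math.hypot(x - tx, y - ty)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            updated[tid] = detections[best]
    # Unmatched detections become new tracks (a new hand entering).
    next_id = max(tracks, default=-1) + 1
    for i, pos in enumerate(detections):
        if i not in used:
            updated[next_id] = pos
            next_id += 1
    return updated
```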

Page 19: Mie presentation

Experimental Results

Touch Detection: its main task is to decide whether the user has touched the screen or not.

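The slides attribute depth sensing to the mirror and Z-depth camera in the physical environment, without detailing the decision rule. A heavily simplified sketch, assuming that setup yields an estimated fingertip distance from the screen plane (the threshold and function names are assumptions):

```python
def is_touching(depth_mm, touch_threshold_mm=15):
    """Decide whether a fingertip is close enough to count as a touch.

    depth_mm: estimated distance of the fingertip from the screen plane,
              as derived from the Z-depth camera / mirror view.
    """
    return depth_mm <= touch_threshold_mm

def touch_points(tracks, depths):
    """Return the screen position of every tracked hand currently touching."""
    return [pos for tid, pos in tracks.items()
            if is_touching(depths.get(tid, float("inf")))]
```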

Page 20: Mie presentation

Experimental Results

Gesture Recognition: responsible for recognizing the shape of the hand.

• Static gestures.
• Dynamic gestures.


[Chart: recognition results on a 0-150 scale for the gestures: 1 main finger, 2 main fingers, open hand, closed hand, 2 fingers]
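The classifier is not given on the slide. One common way to separate the static shapes listed in the results chart (open hand, closed hand, one or two fingers) is to count convexity defects of the hand contour; the sketch below illustrates that idea only (the defect-depth threshold and labels are assumptions):

```python
import cv2

def count_fingers(mask):
    """Rough static-gesture cue: count extended fingers in a binary hand mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Deep convexity defects correspond to the valleys between fingers.
    valleys = sum(1 for i in range(defects.shape[0])
                  if defects[i, 0, 3] / 256.0 > 20)  # depth threshold in pixels
    return valleys + 1 if valleys else 0

def classify_static(mask):
    fingers = count_fingers(mask)
    if fingers >= 4:
        return "open hand"
    if fingers == 0:
        return "closed hand"
    return f"{fingers} finger(s)"
```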

Page 21: Mie presentation

Application

Smart whiteboard application:
• In classrooms.
• In meeting rooms.

Page 22: Mie presentation

Limitations

• The user's top clothing must not be skin-colored.

• The user must wear long sleeves.

• The user must enter the system with a specific gesture.

Page 23: Mie presentation

Future Work

1. Multi-user system.

2. Body tracking.

3. Detecting any shape of hand.

Page 24: Mie presentation

References

[1]. Mennat-Allah Mostafa Mohammad, Nada Sherif Abd El Galeel, Rana Mohammad Ali Roshdy, Sarah Ismail Ibrahim, Multi Touch Interactive Surface, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2009.

[2]. Kai Briechle, Uwe D. Hanebeck, Template Matching Using Fast Normalized Cross Correlation, Institute of Automatic Control Engineering, Technische Universität München, 80290 München, Germany, 2001.

[3]. Rafael C. Gonzalez, Richard E. Woods, Digital Image Processing, Third Edition, Pearson, 2008.

[4]. Gary Bradski, Adrian Kaehler, Learning OpenCV, O'Reilly Media, 2008.

[5]. Alan M. McIvor, Background Subtraction Techniques, in Proc. of Image and Vision Computing, Auckland, New Zealand, 2000.

[6]. Francesca Gasparini, Raimondo Schettini, Skin Segmentation Using Multiple Thresholding, Milan, Italy, 2007.

Page 25: Mie presentation

References

[7]. Hideki Koike, Masataka Toyoura, Kenji Oka, Yoichi Sato, 3-D Interaction with Wall-Sized Display, IEEE Computer Society, 2008.

[8]. Mahdi Mirzabaki, A New Method for Depth Detection Using Interpolation Functions Using a Single Camera, International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 35, Part 3, pages 724-726, 2004.

[9]. Patrick Horain, Mayank Bomb, 3D Model Based Gesture Acquisition Using a Single Camera, Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002.

[10]. Z. Černeková, N. Nikolaidis, I. Pitas, Single Camera Pointing Gesture Recognition Using Spatial Features and Support Vector Machines, EUSIPCO, Poznan, 2007.

Page 26: Mie presentation

Thank You

Any Questions?